Oct 07 13:00:46 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 07 13:00:46 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 13:00:46 crc restorecon[4751]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 
13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc 
restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc 
restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 
13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:46 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 
13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 13:00:47 crc restorecon[4751]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 
crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc 
restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:00:47 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 13:00:48 crc kubenswrapper[4959]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:00:48 crc kubenswrapper[4959]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 13:00:48 crc kubenswrapper[4959]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:00:48 crc kubenswrapper[4959]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 07 13:00:48 crc kubenswrapper[4959]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 13:00:48 crc kubenswrapper[4959]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.470882 4959 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478240 4959 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478273 4959 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478280 4959 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478287 4959 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478295 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478303 4959 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478311 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478318 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478324 4959 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:00:48 crc 
kubenswrapper[4959]: W1007 13:00:48.478330 4959 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478335 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478341 4959 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478347 4959 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478352 4959 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478358 4959 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478364 4959 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478370 4959 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478378 4959 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478385 4959 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478392 4959 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478398 4959 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478404 4959 feature_gate.go:330] unrecognized feature gate: Example
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478410 4959 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478416 4959 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478422 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478428 4959 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478434 4959 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478440 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478452 4959 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478458 4959 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478465 4959 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478471 4959 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478477 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478483 4959 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478491 4959 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478497 4959 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478504 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478510 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478518 4959 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478524 4959 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478530 4959 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478537 4959 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478545 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478551 4959 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478559 4959 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478566 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478573 4959 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478579 4959 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478585 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478591 4959 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478598 4959 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478604 4959 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478612 4959 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478618 4959 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478647 4959 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478654 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478660 4959 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478667 4959 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478673 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478680 4959 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478686 4959 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478692 4959 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478699 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478705 4959 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478716 4959 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478725 4959 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478731 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478737 4959 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478743 4959 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478748 4959 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.478755 4959 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.478912 4959 flags.go:64] FLAG: --address="0.0.0.0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.478935 4959 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.478955 4959 flags.go:64] FLAG: --anonymous-auth="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.478969 4959 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.478980 4959 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.478989 4959 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479001 4959 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479011 4959 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479020 4959 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479027 4959 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479034 4959 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479040 4959 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479047 4959 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479055 4959 flags.go:64] FLAG: --cgroup-root=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479063 4959 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479071 4959 flags.go:64] FLAG: --client-ca-file=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479078 4959 flags.go:64] FLAG: --cloud-config=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479086 4959 flags.go:64] FLAG: --cloud-provider=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479093 4959 flags.go:64] FLAG: --cluster-dns="[]"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479103 4959 flags.go:64] FLAG: --cluster-domain=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479111 4959 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479119 4959 flags.go:64] FLAG: --config-dir=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479127 4959 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479134 4959 flags.go:64] FLAG: --container-log-max-files="5"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479141 4959 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479148 4959 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479154 4959 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479161 4959 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479169 4959 flags.go:64] FLAG: --contention-profiling="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479177 4959 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479185 4959 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479193 4959 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479200 4959 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479210 4959 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479219 4959 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479227 4959 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479235 4959 flags.go:64] FLAG: --enable-load-reader="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479242 4959 flags.go:64] FLAG: --enable-server="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479250 4959 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479264 4959 flags.go:64] FLAG: --event-burst="100"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479272 4959 flags.go:64] FLAG: --event-qps="50"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479280 4959 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479288 4959 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479296 4959 flags.go:64] FLAG: --eviction-hard=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479306 4959 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479313 4959 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479321 4959 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479329 4959 flags.go:64] FLAG: --eviction-soft=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479336 4959 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479344 4959 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479352 4959 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479359 4959 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479367 4959 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479375 4959 flags.go:64] FLAG: --fail-swap-on="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479383 4959 flags.go:64] FLAG: --feature-gates=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479399 4959 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479406 4959 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479414 4959 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479422 4959 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479430 4959 flags.go:64] FLAG: --healthz-port="10248"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479438 4959 flags.go:64] FLAG: --help="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479445 4959 flags.go:64] FLAG: --hostname-override=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479453 4959 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479461 4959 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479469 4959 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479477 4959 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479486 4959 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479494 4959 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479501 4959 flags.go:64] FLAG: --image-service-endpoint=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479509 4959 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479516 4959 flags.go:64] FLAG: --kube-api-burst="100"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479524 4959 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479532 4959 flags.go:64] FLAG: --kube-api-qps="50"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479539 4959 flags.go:64] FLAG: --kube-reserved=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479546 4959 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479555 4959 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479563 4959 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479571 4959 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479578 4959 flags.go:64] FLAG: --lock-file=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479585 4959 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479593 4959 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479601 4959 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479613 4959 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479620 4959 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479655 4959 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479663 4959 flags.go:64] FLAG: --logging-format="text"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479671 4959 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479679 4959 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479686 4959 flags.go:64] FLAG: --manifest-url=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479694 4959 flags.go:64] FLAG: --manifest-url-header=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479704 4959 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479712 4959 flags.go:64] FLAG: --max-open-files="1000000"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479721 4959 flags.go:64] FLAG: --max-pods="110"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479729 4959 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479737 4959 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479744 4959 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479752 4959 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479760 4959 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479768 4959 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479776 4959 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479794 4959 flags.go:64] FLAG: --node-status-max-images="50"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479802 4959 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479809 4959 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479817 4959 flags.go:64] FLAG: --pod-cidr=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479825 4959 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479837 4959 flags.go:64] FLAG: --pod-manifest-path=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479844 4959 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479852 4959 flags.go:64] FLAG: --pods-per-core="0"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479859 4959 flags.go:64] FLAG: --port="10250"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479867 4959 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479874 4959 flags.go:64] FLAG: --provider-id=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479883 4959 flags.go:64] FLAG: --qos-reserved=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479890 4959 flags.go:64] FLAG: --read-only-port="10255"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479898 4959 flags.go:64] FLAG: --register-node="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479909 4959 flags.go:64] FLAG: --register-schedulable="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479918 4959 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479931 4959 flags.go:64] FLAG: --registry-burst="10"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479938 4959 flags.go:64] FLAG: --registry-qps="5"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479945 4959 flags.go:64] FLAG: --reserved-cpus=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479952 4959 flags.go:64] FLAG: --reserved-memory=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479962 4959 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479970 4959 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479977 4959 flags.go:64] FLAG: --rotate-certificates="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479985 4959 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.479992 4959 flags.go:64] FLAG: --runonce="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480000 4959 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480007 4959 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480015 4959 flags.go:64] FLAG: --seccomp-default="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480023 4959 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480030 4959 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480038 4959 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480046 4959 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480055 4959 flags.go:64] FLAG: --storage-driver-password="root"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480062 4959 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480069 4959 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480077 4959 flags.go:64] FLAG: --storage-driver-user="root"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480083 4959 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480090 4959 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480097 4959 flags.go:64] FLAG: --system-cgroups=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480103 4959 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480113 4959 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480119 4959 flags.go:64] FLAG: --tls-cert-file=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480124 4959 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480132 4959 flags.go:64] FLAG: --tls-min-version=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480138 4959 flags.go:64] FLAG: --tls-private-key-file=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480144 4959 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480151 4959 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480159 4959 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480165 4959 flags.go:64] FLAG: --v="2"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480173 4959 flags.go:64] FLAG: --version="false"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480182 4959 flags.go:64] FLAG: --vmodule=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480189 4959 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480196 4959 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480345 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480354 4959 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480360 4959 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480366 4959 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480372 4959 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480378 4959 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480384 4959 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480389 4959 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480394 4959 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480399 4959 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480405 4959 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480410 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480415 4959 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480421 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480427 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480433 4959 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480439 4959 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480445 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480451 4959 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480458 4959 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480464 4959 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480472 4959 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480478 4959 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480485 4959 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480494 4959 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480503 4959 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480512 4959 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480518 4959 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480525 4959 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480532 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480540 4959 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480547 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480555 4959 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480561 4959 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480566 4959 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480571 4959 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480577 4959 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480582 4959 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480587 4959 feature_gate.go:330] unrecognized feature gate: Example
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480592 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480598 4959 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480606 4959 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480619 4959 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480649 4959 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480655 4959 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480660 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480666 4959 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480673 4959 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480680 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480686 4959 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480696 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480703 4959 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480710 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480717 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480724 4959 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480731 4959 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480737 4959 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480744 4959 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480753 4959 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480761 4959 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480768 4959 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480775 4959 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480783 4959 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480790 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480796 4959 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480803 4959 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480811 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480817 4959 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480822 4959 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480828 4959 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.480833 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.480843 4959 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.496069 4959 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.496131 4959 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496235 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496247 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496256 4959 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496263 4959 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496271 4959 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496277 4959 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496282 4959 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496287 4959 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496293 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496298 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496302 4959 feature_gate.go:330] unrecognized feature
gate: AdminNetworkPolicy Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496308 4959 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496313 4959 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496318 4959 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496323 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496328 4959 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496332 4959 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496337 4959 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496342 4959 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496347 4959 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496352 4959 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496358 4959 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496363 4959 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496370 4959 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496375 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:00:48 crc 
kubenswrapper[4959]: W1007 13:00:48.496381 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496386 4959 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496392 4959 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496397 4959 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496404 4959 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496414 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496419 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496425 4959 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496430 4959 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496437 4959 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496442 4959 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496448 4959 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496452 4959 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496457 4959 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496462 4959 
feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496467 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496472 4959 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496477 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496482 4959 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496487 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496492 4959 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496497 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496502 4959 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496507 4959 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496512 4959 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496517 4959 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496523 4959 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496529 4959 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496536 4959 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496541 4959 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496549 4959 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496556 4959 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496563 4959 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496569 4959 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496576 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496582 4959 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496588 4959 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496594 4959 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496600 4959 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496608 4959 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496614 4959 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496643 4959 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496650 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496656 4959 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496662 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496669 4959 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.496678 4959 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496855 4959 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496866 4959 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496872 4959 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496878 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496883 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496890 4959 feature_gate.go:330] unrecognized feature gate: 
VSphereDriverConfiguration Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496894 4959 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496899 4959 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496906 4959 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496913 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496918 4959 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496923 4959 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496928 4959 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496933 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496938 4959 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496943 4959 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496950 4959 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496957 4959 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496963 4959 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496969 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496974 4959 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496980 4959 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496985 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.496992 4959 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497000 4959 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497008 4959 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497015 4959 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497021 4959 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497028 4959 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497034 4959 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497040 4959 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497045 4959 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497050 4959 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497055 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497061 4959 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497065 4959 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497070 4959 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497075 4959 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497080 4959 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:00:48 crc 
kubenswrapper[4959]: W1007 13:00:48.497086 4959 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497092 4959 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497098 4959 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497104 4959 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497110 4959 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497117 4959 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497124 4959 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497130 4959 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497136 4959 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497143 4959 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497148 4959 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497153 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497159 4959 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497164 4959 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497169 4959 feature_gate.go:330] unrecognized 
feature gate: ManagedBootImages Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497174 4959 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497180 4959 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497186 4959 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497192 4959 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497198 4959 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497205 4959 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497211 4959 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497217 4959 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497223 4959 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497229 4959 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497235 4959 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497241 4959 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497246 4959 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497253 4959 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497259 4959 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497264 4959 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.497273 4959 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.497283 4959 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.499302 4959 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.504157 4959 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.504292 4959 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.505930 4959 server.go:997] "Starting client certificate rotation" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.505958 4959 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.506192 4959 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-18 07:29:28.905271839 +0000 UTC Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.506457 4959 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1002h28m40.398825409s for next certificate rotation Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.559564 4959 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.565099 4959 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.592246 4959 log.go:25] "Validated CRI v1 runtime API" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.663082 4959 log.go:25] "Validated CRI v1 image API" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.665153 4959 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.676554 4959 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-12-52-17-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.676667 4959 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.706593 4959 manager.go:217] Machine: {Timestamp:2025-10-07 13:00:48.702730704 +0000 UTC m=+0.863453401 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d0865fee-6f9e-434f-89c6-fcfcb332a933 BootID:b5a2585e-1f5c-45d6-b6d6-2e40f29e327a Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:88:72:3b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:72:3b Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9c:74:51 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ac:ba:1a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:29:87:b5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1f:b3:81 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b6:7a:de Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:ea:a0:1f:0b:a2 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a2:41:53:ad:2b:72 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.706843 4959 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.707046 4959 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.708942 4959 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.709166 4959 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.709205 4959 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.710353 4959 topology_manager.go:138] "Creating topology manager with none policy"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.710370 4959 container_manager_linux.go:303] "Creating device plugin manager"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.710984 4959 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.711012 4959 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.711154 4959 state_mem.go:36] "Initialized new in-memory state store"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.711513 4959 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.719220 4959 kubelet.go:418] "Attempting to sync node with API server"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.719269 4959 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.719303 4959 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.719325 4959 kubelet.go:324] "Adding apiserver pod source"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.719349 4959 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.726716 4959 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.728619 4959 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.729506 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.729585 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.729585 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.729651 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.729875 4959 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736230 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736261 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736268 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736275 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736291 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736298 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736305 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736316 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736323 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736329 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736341 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.736356 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.737673 4959 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.738086 4959 server.go:1280] "Started kubelet"
Oct 07 13:00:48 crc systemd[1]: Started Kubernetes Kubelet.
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.744373 4959 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.745319 4959 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.744372 4959 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.746911 4959 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.747741 4959 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.747815 4959 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.747878 4959 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 11:16:51.885392519 +0000 UTC
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.748055 4959 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1246h16m3.137344789s for next certificate rotation
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.754161 4959 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.754334 4959 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.754367 4959 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.754478 4959 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.754642 4959 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="200ms"
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.758193 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.758328 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.758454 4959 server.go:460] "Adding debug handlers to kubelet server"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.759944 4959 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.759968 4959 factory.go:55] Registering systemd factory
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.759976 4959 factory.go:221] Registration of the systemd container factory successfully
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.760902 4959 factory.go:153] Registering CRI-O factory
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.760918 4959 factory.go:221] Registration of the crio container factory successfully
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.760934 4959 factory.go:103] Registering Raw factory
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.760950 4959 manager.go:1196] Started watching for new ooms in manager
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.761600 4959 manager.go:319] Starting recovery of all containers
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.757940 4959 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.47:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c3704efd3e75f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 13:00:48.738060127 +0000 UTC m=+0.898782794,LastTimestamp:2025-10-07 13:00:48.738060127 +0000 UTC m=+0.898782794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763012 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763113 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763182 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763243 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763299 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763354 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763415 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763475 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763532 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763587 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763672 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763729 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763782 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763848 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763904 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.763987 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764044 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764098 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764160 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764218 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764273 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764334 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764394 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764459 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764534 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764593 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764691 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764769 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764829 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.764947 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765008 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765063 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765125 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765181 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765240 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765304 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765413 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765485 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765543 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765604 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765682 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765748 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765813 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765874 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.765939 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766000 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766086 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766336 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766485 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766581 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766691 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766761 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.766902 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767013 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767100 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767196 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767284 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767350 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767420 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767477 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767539 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767609 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767826 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767892 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.767948 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768007 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768088 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768149 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768211 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768307 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768389 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768485 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768565 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768665 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768760 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768821 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768898 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.768966 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769023 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769087 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769151 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769243 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769319 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769403 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769472 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769531 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769600 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769690 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f"
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769753 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769815 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769890 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.769972 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.770050 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.770112 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.770599 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.770729 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.770801 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.773254 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.774967 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.775007 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.775029 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.775042 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.775055 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.775069 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.775107 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780792 4959 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780876 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780903 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780923 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780941 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780957 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780971 4959 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.780988 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781003 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781020 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781033 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781045 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781057 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781069 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781081 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781093 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781107 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781122 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781137 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781169 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781182 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781197 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781209 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781221 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781248 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" 
seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781262 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781274 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781286 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781298 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781310 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781325 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781336 
4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781391 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781406 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781418 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781431 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781446 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781460 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781473 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781485 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781497 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781510 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781522 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781534 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781546 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781558 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781581 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781604 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781620 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781656 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781668 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781680 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781694 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781708 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781721 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781732 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781744 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781757 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781770 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781782 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781795 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781806 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781818 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781832 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781844 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781856 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781868 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781879 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781892 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781905 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781916 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781929 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781941 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781953 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781964 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781976 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781988 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.781999 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782010 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782021 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782033 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782047 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782060 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782073 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782085 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782096 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782107 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782120 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782134 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782147 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782158 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782170 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782182 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782193 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782204 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782217 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782230 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782242 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782253 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782265 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782276 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782287 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782298 4959 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782309 4959 reconstruct.go:97] "Volume reconstruction finished"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782318 4959 reconciler.go:26] "Reconciler: start to sync state"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.782944 4959 manager.go:324] Recovery completed
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.792370 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.793957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.794002 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.794017 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.794809 4959 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.794825 4959 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.794845 4959 state_mem.go:36] "Initialized new in-memory state store"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.805743 4959 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.807456 4959 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.807508 4959 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.807540 4959 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.807593 4959 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 07 13:00:48 crc kubenswrapper[4959]: W1007 13:00:48.808276 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.808344 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.815092 4959 policy_none.go:49] "None policy: Start"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.816306 4959 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.816383 4959 state_mem.go:35] "Initializing new in-memory state store"
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.854288 4959 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.872959 4959 manager.go:334] "Starting Device Plugin manager"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873040 4959 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873055 4959 server.go:79] "Starting device plugin registration server"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873617 4959 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873670 4959 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873874 4959 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873977 4959 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.873987 4959 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.882223 4959 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.907797 4959 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.907903 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.910811 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.910854 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.910865 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.911026 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.911202 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.911245 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.911995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912024 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912032 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912064 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912092 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912100 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912270 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912454 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.912503 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913273 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913308 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913324 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913418 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913614 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913651 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913668 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913683 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.913685 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914106 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914134 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914142 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914245 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914371 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914399 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.914941 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915040 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915237 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915259 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915269 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915819 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.915945 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.916557 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.916615 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.916653 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.918563 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.918596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.918608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.955780 4959 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="400ms"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.973844 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.974613 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.974653 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.974663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.974685 4959 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: E1007 13:00:48.974956 4959 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984566 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984597 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984617 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984644 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984660 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984677 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984691 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984735 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984758 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984777 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984795 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984814 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984847 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984873 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:48 crc kubenswrapper[4959]: I1007 13:00:48.984908 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085801 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085870 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085895 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085916 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085936 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085956 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085974 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.085996 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086020 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086041 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086063 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086055 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086058 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086084 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086277 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086273 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086317 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086153 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086168 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086177 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086176 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086183 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086189 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086293 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086468 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086305 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 
07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086148 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086524 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086150 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.086125 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.175350 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.176492 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.176542 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 
13:00:49.176555 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.176582 4959 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:00:49 crc kubenswrapper[4959]: E1007 13:00:49.177153 4959 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.248938 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.254726 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.272684 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.279735 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.283534 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.310462 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d5930fec9ec9471abbe6288e4943dfec982ed57801054b7e88d35b4e6410ea72 WatchSource:0}: Error finding container d5930fec9ec9471abbe6288e4943dfec982ed57801054b7e88d35b4e6410ea72: Status 404 returned error can't find the container with id d5930fec9ec9471abbe6288e4943dfec982ed57801054b7e88d35b4e6410ea72 Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.314420 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c8bfa76ca09f86fa443614dba0d234244f6fa1ac37a79d98af0717d2d650f412 WatchSource:0}: Error finding container c8bfa76ca09f86fa443614dba0d234244f6fa1ac37a79d98af0717d2d650f412: Status 404 returned error can't find the container with id c8bfa76ca09f86fa443614dba0d234244f6fa1ac37a79d98af0717d2d650f412 Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.321022 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1a7760ec73508751666fff9ce45129c91c23e2f08ea3ed97c1051be5376c7e2e WatchSource:0}: Error finding container 1a7760ec73508751666fff9ce45129c91c23e2f08ea3ed97c1051be5376c7e2e: Status 404 returned error can't find the container with id 1a7760ec73508751666fff9ce45129c91c23e2f08ea3ed97c1051be5376c7e2e Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.323971 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-aa17ea4b971ca9dd24af88c92193d305f00c9e0d810bd5cbce7bb93cedebda74 
WatchSource:0}: Error finding container aa17ea4b971ca9dd24af88c92193d305f00c9e0d810bd5cbce7bb93cedebda74: Status 404 returned error can't find the container with id aa17ea4b971ca9dd24af88c92193d305f00c9e0d810bd5cbce7bb93cedebda74 Oct 07 13:00:49 crc kubenswrapper[4959]: E1007 13:00:49.356842 4959 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="800ms" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.577810 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.579040 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.579075 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.579085 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.579107 4959 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:00:49 crc kubenswrapper[4959]: E1007 13:00:49.579506 4959 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc" Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.624798 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:49 crc 
kubenswrapper[4959]: E1007 13:00:49.624861 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.649808 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:49 crc kubenswrapper[4959]: E1007 13:00:49.649852 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.746114 4959 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.813150 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d5930fec9ec9471abbe6288e4943dfec982ed57801054b7e88d35b4e6410ea72"} Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.814128 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa17ea4b971ca9dd24af88c92193d305f00c9e0d810bd5cbce7bb93cedebda74"} Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.822541 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f16703bc1a11e5cd1fbb5803a61ad033a2c1bcd5790d353451eeedc7b459c75"} Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.824672 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1a7760ec73508751666fff9ce45129c91c23e2f08ea3ed97c1051be5376c7e2e"} Oct 07 13:00:49 crc kubenswrapper[4959]: I1007 13:00:49.825560 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c8bfa76ca09f86fa443614dba0d234244f6fa1ac37a79d98af0717d2d650f412"} Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.937725 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:49 crc kubenswrapper[4959]: E1007 13:00:49.937828 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:00:49 crc kubenswrapper[4959]: W1007 13:00:49.938090 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:49 crc kubenswrapper[4959]: E1007 13:00:49.938256 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:00:50 crc kubenswrapper[4959]: E1007 13:00:50.157565 4959 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="1.6s" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.380619 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.381815 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.381860 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.381871 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.381902 4959 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:00:50 crc kubenswrapper[4959]: E1007 13:00:50.382334 4959 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" 
node="crc" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.747039 4959 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.831084 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2"} Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.831170 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.832668 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233"} Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.832921 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.833051 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.833087 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.833097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.834209 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:50 crc 
kubenswrapper[4959]: I1007 13:00:50.834251 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.834260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.835193 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534"} Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.836577 4959 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d11940cf7b6bb581b5c17a55adcdfc768a6dac7babc1d181ae145fa210b1eb9c" exitCode=0 Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.836666 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d11940cf7b6bb581b5c17a55adcdfc768a6dac7babc1d181ae145fa210b1eb9c"} Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.836767 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.837719 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.837749 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.837759 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.838548 4959 generic.go:334] "Generic (PLEG): container 
finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d9936111edf568f371613694e67f670972acf94dea105b3b4c85be8a4f9bb22d" exitCode=0 Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.838576 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d9936111edf568f371613694e67f670972acf94dea105b3b4c85be8a4f9bb22d"} Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.838620 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.839648 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.839669 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:50 crc kubenswrapper[4959]: I1007 13:00:50.839678 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:51 crc kubenswrapper[4959]: W1007 13:00:51.472254 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:51 crc kubenswrapper[4959]: E1007 13:00:51.472353 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.747275 4959 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Oct 07 13:00:51 crc kubenswrapper[4959]: E1007 13:00:51.759613 4959 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="3.2s" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.845743 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.845721 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6a4d10fee883718bf74d987b3c350647a4beb21675a8f68736e0921528fb11e9"} Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.847583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.847709 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.847740 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.847973 4959 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2" exitCode=0 Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.848119 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2"} Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.848156 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.849772 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.849848 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.849876 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.850857 4959 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233" exitCode=0 Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.850953 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233"} Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.851007 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.852305 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.852340 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.852360 4959 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.854143 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1"}
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.854233 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.854235 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a"}
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.854390 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2"}
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.854430 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.855949 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.855977 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.856020 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.856045 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.855993 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.856085 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.858464 4959 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="46527343daf5429f576fa768cceeba35d1f59062e578476d59c877ed93083c90" exitCode=0
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.858513 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"46527343daf5429f576fa768cceeba35d1f59062e578476d59c877ed93083c90"}
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.858709 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.859894 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.859927 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.859940 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.982933 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.984615 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.984687 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.984726 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:51 crc kubenswrapper[4959]: I1007 13:00:51.984761 4959 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 07 13:00:51 crc kubenswrapper[4959]: E1007 13:00:51.985507 4959 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc"
Oct 07 13:00:52 crc kubenswrapper[4959]: W1007 13:00:52.335434 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:52 crc kubenswrapper[4959]: E1007 13:00:52.335511 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:52 crc kubenswrapper[4959]: W1007 13:00:52.605475 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:52 crc kubenswrapper[4959]: E1007 13:00:52.605576 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:52 crc kubenswrapper[4959]: W1007 13:00:52.739680 4959 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:52 crc kubenswrapper[4959]: E1007 13:00:52.739760 4959 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.746532 4959 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.864949 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.865019 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.865036 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.865162 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.866385 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.866418 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.866432 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.868115 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.868146 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.868167 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.869736 4959 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d7820a50f5329505372fec026c5a882e1c12407a4d79af5eba3b9ad99fa48c90" exitCode=0
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.869853 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.870232 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.870499 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d7820a50f5329505372fec026c5a882e1c12407a4d79af5eba3b9ad99fa48c90"}
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.870607 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.870817 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.870839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.870848 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.871325 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.871398 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.871420 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.871465 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.871488 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:52 crc kubenswrapper[4959]: I1007 13:00:52.871500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.746411 4959 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.881082 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb"}
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.881140 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270"}
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.881272 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.886100 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.886136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.886150 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.890273 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.890697 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"312125d4d0c09a6a4ab482468a25a2450713e015859407828cd8a49d544c1483"}
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.890729 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c1eb7fe8a7e924894ea24b92a12e9e43c9d46ed10e8d9155e9bc8eab59a87104"}
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.890754 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b14abe5ee99dd207caf5bd51d812cf73517636174bab67fd0cda461ad94be1cf"}
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.890770 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.891112 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.891133 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:53 crc kubenswrapper[4959]: I1007 13:00:53.891144 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.503842 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.504060 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.505197 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.505243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.505258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.898692 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9909669435c6d1210b5023239566b7675267744eea23011e9c4236c72a247a0d"}
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.898757 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d418c7ef477647c1476381ce30eeffbf4a196db028153acc2199b9fb9649d3f1"}
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.898781 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.898881 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.899061 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.899501 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.899890 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.899930 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.899946 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.899957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.900291 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.900309 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.900751 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.900792 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:54 crc kubenswrapper[4959]: I1007 13:00:54.900804 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.186217 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.188500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.188581 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.188597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.188664 4959 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.901426 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.901551 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.903278 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.903348 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.903375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.903539 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.903615 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:55 crc kubenswrapper[4959]: I1007 13:00:55.903682 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.126894 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.127136 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.128527 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.128562 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.128576 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.131344 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.904476 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.906233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.906320 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:56 crc kubenswrapper[4959]: I1007 13:00:56.906349 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:57 crc kubenswrapper[4959]: I1007 13:00:57.418504 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 07 13:00:57 crc kubenswrapper[4959]: I1007 13:00:57.418758 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:57 crc kubenswrapper[4959]: I1007 13:00:57.420244 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:57 crc kubenswrapper[4959]: I1007 13:00:57.420283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:57 crc kubenswrapper[4959]: I1007 13:00:57.420297 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.654790 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.655070 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.656779 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.656830 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.656844 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.849380 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:00:58 crc kubenswrapper[4959]: E1007 13:00:58.882472 4959 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.911798 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.913673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.913744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:00:58 crc kubenswrapper[4959]: I1007 13:00:58.913766 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.702782 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.703270 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.704886 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.704935 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.704948 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.707374 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.816522 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.919865 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.920818 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.920857 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:01 crc kubenswrapper[4959]: I1007 13:01:01.920869 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:02 crc kubenswrapper[4959]: I1007 13:01:02.923279 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:02 crc kubenswrapper[4959]: I1007 13:01:02.924743 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:02 crc kubenswrapper[4959]: I1007 13:01:02.924801 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:02 crc kubenswrapper[4959]: I1007 13:01:02.924820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.824669 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.825078 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.826759 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.826813 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.826828 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.863927 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.925067 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.926172 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.926212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.926221 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:03 crc kubenswrapper[4959]: I1007 13:01:03.937905 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.322025 4959 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.322112 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.333834 4959 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.333904 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.817454 4959 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.817536 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.927033 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.928377 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.928434 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:04 crc kubenswrapper[4959]: I1007 13:01:04.928450 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.854130 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.854346 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.855689 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.855714 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.855727 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.861955 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 13:01:08 crc kubenswrapper[4959]: E1007 13:01:08.882652 4959 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.936514 4959 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.937948 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.937997 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:08 crc kubenswrapper[4959]: I1007 13:01:08.938010 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.304586 4959 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.308147 4959 trace.go:236] Trace[1048736610]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:00:57.095) (total time: 12212ms):
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[1048736610]: ---"Objects listed" error: 12212ms (13:01:09.307)
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[1048736610]: [12.212144025s] [12.212144025s] END
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.308191 4959 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.311774 4959 trace.go:236] Trace[1619772891]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:00:56.174) (total time: 13137ms):
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[1619772891]: ---"Objects listed" error: 13137ms (13:01:09.311)
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[1619772891]: [13.137243122s] [13.137243122s] END
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.311808 4959 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.311862 4959 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.312078 4959 trace.go:236] Trace[103524532]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:00:58.337) (total time: 10974ms):
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[103524532]: ---"Objects listed" error: 10974ms (13:01:09.311)
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[103524532]: [10.974546891s] [10.974546891s] END
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.312124 4959 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.312267 4959 trace.go:236] Trace[1638466326]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:00:58.641) (total time: 10670ms):
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[1638466326]: ---"Objects listed" error: 10670ms (13:01:09.312)
Oct 07 13:01:09 crc kubenswrapper[4959]: Trace[1638466326]: [10.670688225s] [10.670688225s] END
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.312289 4959 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.321188 4959 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.321657 4959 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.323585 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.323649 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.323658 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.323681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.323693 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.368033 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.372567 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.372617 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.372647 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.372668 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.372689 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.383044 4959 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58658->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.383116 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58658->192.168.126.11:17697: read: connection reset by peer" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.383222 4959 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53610->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.383240 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53610->192.168.126.11:17697: read: connection reset by peer" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.383502 4959 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" 
start-of-body= Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.383525 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.394310 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.400170 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.400248 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.400268 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.400296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.400318 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.412237 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.417376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.417487 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.417559 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.417652 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.417711 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.426361 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.430654 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.430758 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.430845 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.430939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.431002 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.440286 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.440415 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.442369 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.442398 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.442407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.442426 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.442437 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.545554 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.545603 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.545617 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.545664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.545679 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.648536 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.648592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.648605 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.648643 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.648663 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.732901 4959 apiserver.go:52] "Watching apiserver" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.752673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.752719 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.752737 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.752764 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.752774 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.765434 4959 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.765764 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-ln4wb","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.766122 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.766193 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.766262 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.766417 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.766454 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.766469 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.767109 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ln4wb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.767138 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.767221 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.767302 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.769084 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.769935 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.769940 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.770098 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.770112 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.770349 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.770424 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.770772 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.770772 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.771673 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 
13:01:09.772831 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.773922 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.787305 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.799521 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.809191 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.816820 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.824890 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.840244 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.855404 4959 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.856430 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.856494 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.856509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.856531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.856546 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:09Z","lastTransitionTime":"2025-10-07T13:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.857155 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.871104 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.884921 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914495 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914555 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914587 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914609 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914657 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914681 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914707 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914732 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914759 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914818 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914840 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914860 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914888 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " 
Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914912 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914931 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914951 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.914972 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915025 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915015 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915047 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915111 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915133 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915154 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915174 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 
13:01:09.915194 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915216 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915241 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915268 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915290 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915313 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915338 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915358 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915409 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915433 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915456 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915479 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915522 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915548 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915572 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915593 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915615 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915700 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915729 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915782 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915803 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915227 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915835 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915859 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915251 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915424 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915883 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915906 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915930 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915957 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915982 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916009 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916033 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916054 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916078 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916102 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916126 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 
13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916306 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916338 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916363 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916391 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916413 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916436 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916458 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916481 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916506 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916531 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916555 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") 
" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916580 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916603 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916642 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916664 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916694 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916718 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916741 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916764 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916790 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916813 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916837 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916865 
4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916888 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916914 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916937 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916958 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916985 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917011 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917035 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917060 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917082 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917109 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917133 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917159 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917188 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917211 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917233 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917318 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917341 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917365 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917388 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917410 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917434 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917457 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917479 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917500 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917522 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917545 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917567 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917589 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917617 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917676 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917698 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917723 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915415 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917761 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915465 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915485 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915584 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915598 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915616 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915719 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915768 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915792 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915954 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.915994 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916041 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916202 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916209 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916197 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916319 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916492 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916534 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916566 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916760 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916772 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916809 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916823 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.918003 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916937 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916966 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.916993 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917063 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917199 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917257 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917340 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917472 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917571 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.918510 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.918586 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.918670 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.918801 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.918871 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.919213 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.919251 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.920171 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.920210 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.920503 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.920839 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.921551 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.921839 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.921867 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.922174 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.922320 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.922482 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.922586 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.922662 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.921928 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.922965 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.923933 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924074 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924099 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924264 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924381 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924399 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.917745 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924587 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924706 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924737 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924871 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924878 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.924741 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925143 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925130 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925180 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925222 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925255 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925444 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925496 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925533 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925601 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925657 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925735 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925771 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925809 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925847 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925986 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.925852 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926077 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926120 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.926154 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:01:10.426124457 +0000 UTC m=+22.586847134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926201 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926245 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926352 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926225 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926374 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926469 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926499 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926540 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926578 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926638 4959 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926686 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926716 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926748 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926778 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926812 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926857 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926907 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926939 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926968 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.926998 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927018 4959 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927032 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927078 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927090 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927121 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927160 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927191 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927221 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927255 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927281 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927310 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927341 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927367 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927387 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927230 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927412 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927482 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927511 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927372 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927550 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927596 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927667 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927700 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927728 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927760 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927833 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927863 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927896 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927934 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.927973 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928003 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928082 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928116 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928145 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928182 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928226 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928263 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928299 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928340 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928373 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928406 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928444 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928480 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928511 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928545 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928581 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928610 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.929821 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.929857 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.929884 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.929908 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.929937 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.929964 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930058 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930098 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930134 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsc6z\" (UniqueName: \"kubernetes.io/projected/072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0-kube-api-access-hsc6z\") pod \"node-resolver-ln4wb\" (UID: \"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\") " pod="openshift-dns/node-resolver-ln4wb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930167 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930219 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930247 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930277 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930308 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930335 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930372 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930399 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930426 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930456 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930481 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930507 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0-hosts-file\") pod \"node-resolver-ln4wb\" (UID: \"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\") " pod="openshift-dns/node-resolver-ln4wb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930536 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930683 4959 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930699 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930714 4959 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: 
I1007 13:01:09.930733 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930755 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931320 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931345 4959 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931365 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931378 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931392 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931411 4959 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931425 4959 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931438 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931452 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931469 4959 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931481 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931494 4959 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931508 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on 
node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931526 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931541 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931567 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931592 4959 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931608 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931665 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931682 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 
13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931700 4959 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931714 4959 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932085 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932112 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932131 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932148 4959 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932162 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932181 4959 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932194 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932208 4959 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932224 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932244 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932259 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932276 4959 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932311 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932328 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932342 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932355 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932368 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932386 4959 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932402 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932415 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on 
node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932433 4959 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932446 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932467 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932479 4959 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932505 4959 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932517 4959 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932535 4959 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932556 4959 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932574 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932587 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932601 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932619 4959 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932792 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932809 4959 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932821 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932838 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932850 4959 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932862 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932875 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932892 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932904 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932918 4959 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 
13:01:09.932931 4959 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932949 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932962 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932975 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932994 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.933007 4959 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.933021 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.933035 4959 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.933052 4959 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.933073 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.953600 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928403 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.956939 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.957309 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.959538 4959 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.960286 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.940531 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928454 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928604 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.928718 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930027 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.930302 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.931304 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932215 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932354 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.932824 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.933319 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.935090 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.936764 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.937467 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.937838 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.938107 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.938933 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939221 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939332 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939539 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939551 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939597 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939916 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.939984 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.961019 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.961506 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.940027 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.940224 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.940339 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.940416 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.941110 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.942145 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.942489 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.942854 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.942889 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.943403 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.943835 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.944197 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.945181 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.947201 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.948569 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.948721 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.948903 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.949545 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.949985 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.950186 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.950212 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.950227 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.950288 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.950823 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.951200 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.951304 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.951536 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.951552 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.951786 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.952172 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.952327 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.952874 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.962724 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:10.462600842 +0000 UTC m=+22.623323519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.953102 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.953157 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.953295 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.953378 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.953553 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.954260 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.954374 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.954566 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.954994 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.955519 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.955397 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.955872 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.956334 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.956323 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.957713 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.958099 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.959294 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.959832 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.963772 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.960176 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.960344 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.964261 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.964422 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.964606 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.964666 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: E1007 13:01:09.965015 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:10.464967458 +0000 UTC m=+22.625690145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.966245 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.967426 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.967882 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.968240 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969012 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969096 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969510 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969609 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969788 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969821 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969922 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.969642 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.970321 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.972933 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:09 crc kubenswrapper[4959]: I1007 13:01:09.990826 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.004362 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.004863 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.005064 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.005228 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.005764 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.006071 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.011081 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.011324 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.011937 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012043 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012240 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012271 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012733 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012771 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012788 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012818 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012832 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.012901 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.014034 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.016023 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034265 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsc6z\" (UniqueName: \"kubernetes.io/projected/072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0-kube-api-access-hsc6z\") pod \"node-resolver-ln4wb\" (UID: \"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\") " pod="openshift-dns/node-resolver-ln4wb" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034349 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034406 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0-hosts-file\") pod \"node-resolver-ln4wb\" (UID: \"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\") " pod="openshift-dns/node-resolver-ln4wb" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034456 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034584 4959 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034603 4959 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034640 4959 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034655 4959 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034665 4959 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.034846 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.035264 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035580 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035612 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.035667 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0-hosts-file\") pod \"node-resolver-ln4wb\" (UID: \"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\") " pod="openshift-dns/node-resolver-ln4wb"
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035678 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035750 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035794 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035811 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:10.535782827 +0000 UTC m=+22.696505504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035824 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.035881 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:10.53586727 +0000 UTC m=+22.696589947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.053637 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.054004 4959 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb" exitCode=255
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.054045 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb"}
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.054224 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.055364 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.065281 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.069874 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.070442 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsc6z\" (UniqueName: \"kubernetes.io/projected/072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0-kube-api-access-hsc6z\") pod \"node-resolver-ln4wb\" (UID: \"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\") " pod="openshift-dns/node-resolver-ln4wb"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071370 4959 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071438 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071459 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071475 4959 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071506 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071517 4959 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071526 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071536 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071545 4959 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071554 4959 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071605 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071618 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071651 4959 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071662 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071671 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071680 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071689 4959 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071698 4959 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071723 4959 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071733 4959 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071749 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071758 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071770 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071780 4959 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071804 4959 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071814 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071826 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071839 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071854 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071889 4959 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071901 4959 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071911 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071921 4959 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071930 4959 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071939 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071966 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071975 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071985 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.071999 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072008 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072018 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072043 4959 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072054 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072062 4959 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072072 4959 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072082 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072092 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072116 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072125 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072139 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072149 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072161 4959 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072170 4959 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072196 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072207 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072217 4959 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072235 4959 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072245 4959 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.072254 4959 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074745 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074756 4959 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074767 4959 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074776 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074699 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074829 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074838 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074846 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074856 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074881 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074891 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074900 4959 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074910 4959 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074920 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074929 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074953 4959 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074965 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074976 4959 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074990 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.074999 4959 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075008 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075031 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075044 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075057 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075069 4959 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075082 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075112 4959 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075125 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075138 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075149 4959 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075161 4959 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075171 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075196 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075207 4959 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075216 4959 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075226 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075236 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075245 4959 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075269 4959 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075278 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075288 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075297 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075306 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075315 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075324 4959 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075351 4959 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075363 4959 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.075375 4959 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.081009 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.087692 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-b2pc7"]
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.088044 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dgmtp"]
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.088318 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.088673 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b2pc7"
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.088787 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.089513 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.092254 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.092800 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.092857 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093129 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093257 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093312 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.092819 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093459 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093506 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093602 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.093785 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ln4wb" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.098966 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.102044 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.108140 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-w62d8"] Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.108825 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.114209 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.114537 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.130280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.130326 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.130338 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.130358 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.130372 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.132893 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.137568 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.137985 4959 scope.go:117] "RemoveContainer" containerID="5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.155033 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.174779 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175755 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-cni-bin\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175811 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-k8s-cni-cncf-io\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175845 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-conf-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175893 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-multus-certs\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175914 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-system-cni-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175939 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-os-release\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.175984 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-netns\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176002 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-hostroot\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176035 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-cnibin\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176055 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz4bt\" (UniqueName: \"kubernetes.io/projected/4cbefab5-1f50-4f44-9163-479625fa11a4-kube-api-access-gz4bt\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176086 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-socket-dir-parent\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176120 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-cni-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176149 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4cbefab5-1f50-4f44-9163-479625fa11a4-mcd-auth-proxy-config\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176206 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-cni-multus\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176232 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-kubelet\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176251 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-cnibin\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176281 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39c3422b-6d08-4084-835f-3c6eeb42e474-cni-binary-copy\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176298 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176313 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-etc-kubernetes\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176351 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cbefab5-1f50-4f44-9163-479625fa11a4-proxy-tls\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176374 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-os-release\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176392 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gll64\" (UniqueName: \"kubernetes.io/projected/39c3422b-6d08-4084-835f-3c6eeb42e474-kube-api-access-gll64\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176407 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4cbefab5-1f50-4f44-9163-479625fa11a4-rootfs\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176439 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfw2r\" (UniqueName: \"kubernetes.io/projected/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-kube-api-access-pfw2r\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176458 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-daemon-config\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176481 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-system-cni-dir\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " 
pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176510 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39c3422b-6d08-4084-835f-3c6eeb42e474-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176537 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-cni-binary-copy\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176578 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176590 4959 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176601 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.176612 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:10 crc 
kubenswrapper[4959]: I1007 13:01:10.193201 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.206062 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.231213 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.233279 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.233319 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.233328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.233343 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.233354 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.258976 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.274345 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277286 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-cni-binary-copy\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277339 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-cni-bin\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277356 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-k8s-cni-cncf-io\") pod 
\"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277374 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-conf-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277417 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-system-cni-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277434 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-os-release\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277449 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-netns\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277467 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-hostroot\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277489 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-multus-certs\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277487 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-k8s-cni-cncf-io\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277506 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-cnibin\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277495 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-cni-bin\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277526 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz4bt\" (UniqueName: \"kubernetes.io/projected/4cbefab5-1f50-4f44-9163-479625fa11a4-kube-api-access-gz4bt\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277555 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-netns\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277563 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-socket-dir-parent\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277792 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-cni-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277825 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-cnibin\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277836 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4cbefab5-1f50-4f44-9163-479625fa11a4-mcd-auth-proxy-config\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277854 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-hostroot\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277860 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-cni-multus\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277878 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-run-multus-certs\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277886 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-kubelet\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277894 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-cni-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277913 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w62d8\" (UID: 
\"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277917 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-os-release\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277936 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-socket-dir-parent\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277837 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-system-cni-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277795 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-conf-dir\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.277933 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-cnibin\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278017 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-cnibin\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278047 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39c3422b-6d08-4084-835f-3c6eeb42e474-cni-binary-copy\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278066 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-cni-multus\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278083 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-etc-kubernetes\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278101 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-host-var-lib-kubelet\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278131 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4cbefab5-1f50-4f44-9163-479625fa11a4-proxy-tls\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278163 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4cbefab5-1f50-4f44-9163-479625fa11a4-rootfs\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278188 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-os-release\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278213 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gll64\" (UniqueName: \"kubernetes.io/projected/39c3422b-6d08-4084-835f-3c6eeb42e474-kube-api-access-gll64\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278251 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfw2r\" (UniqueName: \"kubernetes.io/projected/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-kube-api-access-pfw2r\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278277 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-system-cni-dir\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278304 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39c3422b-6d08-4084-835f-3c6eeb42e474-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278330 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-daemon-config\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278986 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.278405 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-cni-binary-copy\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.280079 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4cbefab5-1f50-4f44-9163-479625fa11a4-mcd-auth-proxy-config\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.280779 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4cbefab5-1f50-4f44-9163-479625fa11a4-rootfs\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.280871 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-os-release\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.280909 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39c3422b-6d08-4084-835f-3c6eeb42e474-system-cni-dir\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.281106 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-etc-kubernetes\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.281585 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/39c3422b-6d08-4084-835f-3c6eeb42e474-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.282093 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-multus-daemon-config\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.282432 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39c3422b-6d08-4084-835f-3c6eeb42e474-cni-binary-copy\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.286262 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cbefab5-1f50-4f44-9163-479625fa11a4-proxy-tls\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.297603 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.300067 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfw2r\" (UniqueName: \"kubernetes.io/projected/07e132b2-5c1c-488e-abf4-bdaf3fcf4f93-kube-api-access-pfw2r\") pod \"multus-b2pc7\" (UID: \"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\") " pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.300557 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz4bt\" (UniqueName: \"kubernetes.io/projected/4cbefab5-1f50-4f44-9163-479625fa11a4-kube-api-access-gz4bt\") pod \"machine-config-daemon-dgmtp\" (UID: \"4cbefab5-1f50-4f44-9163-479625fa11a4\") " pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.303549 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gll64\" (UniqueName: \"kubernetes.io/projected/39c3422b-6d08-4084-835f-3c6eeb42e474-kube-api-access-gll64\") pod \"multus-additional-cni-plugins-w62d8\" (UID: \"39c3422b-6d08-4084-835f-3c6eeb42e474\") " pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.316803 4959 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\
\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.326947 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.337042 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.337090 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.337102 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.337118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.337130 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.339201 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.352763 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.362355 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.371740 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.382056 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.390588 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.415263 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.425916 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b2pc7" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.432224 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w62d8" Oct 07 13:01:10 crc kubenswrapper[4959]: W1007 13:01:10.435851 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbefab5_1f50_4f44_9163_479625fa11a4.slice/crio-58b16c71f758dd8bee537432115093da6a8eeec042ae73ac65257ff936307185 WatchSource:0}: Error finding container 58b16c71f758dd8bee537432115093da6a8eeec042ae73ac65257ff936307185: Status 404 returned error can't find the container with id 58b16c71f758dd8bee537432115093da6a8eeec042ae73ac65257ff936307185 Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.440184 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.440217 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.440228 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.440249 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.440264 4959 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.456924 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jfm8k"] Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.457770 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.462165 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.462301 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.462337 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.462385 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.462464 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.462711 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.463047 4959 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 13:01:10 crc kubenswrapper[4959]: W1007 13:01:10.476765 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e132b2_5c1c_488e_abf4_bdaf3fcf4f93.slice/crio-781b22b4fe22f862a6070a47d5623a122260061a9a28456bb85004f93ef4883a WatchSource:0}: Error finding container 781b22b4fe22f862a6070a47d5623a122260061a9a28456bb85004f93ef4883a: Status 404 returned error can't find the container with id 781b22b4fe22f862a6070a47d5623a122260061a9a28456bb85004f93ef4883a Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.477672 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 
13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\"
:\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.479861 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.479974 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480006 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-node-log\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480022 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-env-overrides\") pod 
\"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480038 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-log-socket\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480056 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-etc-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480081 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-script-lib\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480113 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovn-node-metrics-cert\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480134 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-kubelet\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480152 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-bin\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480168 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77tdh\" (UniqueName: \"kubernetes.io/projected/b26fd9a1-4343-4f1c-bef3-764d3c74724a-kube-api-access-77tdh\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480186 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-systemd\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480201 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-netns\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480228 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-ovn\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480245 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-systemd-units\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480262 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-slash\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480279 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480302 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480319 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480337 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480354 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-netd\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480372 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-config\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.480398 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-var-lib-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 
13:01:10.480487 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:01:11.480465919 +0000 UTC m=+23.641188596 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.480673 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.480709 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:11.480700936 +0000 UTC m=+23.641423613 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.480755 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.480779 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:11.480772508 +0000 UTC m=+23.641495185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.489562 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.511994 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.525172 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.542132 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.545963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.545996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.546009 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.546029 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.546042 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.557178 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.570221 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.579720 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581107 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-netd\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581162 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-config\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581185 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581202 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-var-lib-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581220 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-node-log\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581221 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-netd\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581239 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581257 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-log-socket\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581271 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-env-overrides\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581287 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-var-lib-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581289 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-etc-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581324 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-etc-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581329 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-script-lib\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581355 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-node-log\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581370 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581385 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581397 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-kubelet\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581407 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-log-socket\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581415 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-bin\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581433 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovn-node-metrics-cert\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581447 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77tdh\" (UniqueName: \"kubernetes.io/projected/b26fd9a1-4343-4f1c-bef3-764d3c74724a-kube-api-access-77tdh\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581473 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-systemd\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581492 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-systemd-units\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581508 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-netns\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581524 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-ovn\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581546 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-slash\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581582 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581609 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-openvswitch\") 
pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581676 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-openvswitch\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.581994 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-env-overrides\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582037 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-netns\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582065 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-systemd\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582096 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-systemd-units\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" 
Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582184 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582204 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582216 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582253 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:11.582241448 +0000 UTC m=+23.742964125 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582318 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-config\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582360 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-ovn\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582386 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-slash\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582409 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582418 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-kubelet\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.582516 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-bin\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582606 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582640 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582651 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:10 crc kubenswrapper[4959]: E1007 13:01:10.582698 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:11.58268899 +0000 UTC m=+23.743411767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.585933 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-script-lib\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.588597 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovn-node-metrics-cert\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.593543 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.607941 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.608216 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77tdh\" (UniqueName: \"kubernetes.io/projected/b26fd9a1-4343-4f1c-bef3-764d3c74724a-kube-api-access-77tdh\") pod \"ovnkube-node-jfm8k\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.626205 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.639125 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.649300 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.649376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.649389 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.649406 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.649415 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.752356 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.752400 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.752415 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.752434 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.752447 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.806505 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.812233 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.812769 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.814025 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.814812 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.815769 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.816245 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.816864 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.820448 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.821295 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.822466 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.823319 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.825202 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.825883 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.826435 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.827505 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.828110 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.829141 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.829521 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.830102 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.831128 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.831643 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.832599 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.833143 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.834120 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.834546 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.835841 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.838561 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.839205 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.840582 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.841147 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.842074 4959 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.842172 4959 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.843982 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.844444 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.845244 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.846883 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: W1007 13:01:10.847223 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb26fd9a1_4343_4f1c_bef3_764d3c74724a.slice/crio-bbd780aded1d877ecf4e1cd985d32ca3597d0c60c31c75f13ee41e23a00d0112 WatchSource:0}: Error finding container bbd780aded1d877ecf4e1cd985d32ca3597d0c60c31c75f13ee41e23a00d0112: Status 404 returned error can't find the container with id bbd780aded1d877ecf4e1cd985d32ca3597d0c60c31c75f13ee41e23a00d0112 Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.847545 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 13:01:10 crc 
kubenswrapper[4959]: I1007 13:01:10.848597 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.849303 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.850499 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.851181 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.853042 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.863142 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.864221 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.864988 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 13:01:10 crc 
kubenswrapper[4959]: I1007 13:01:10.864985 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.866961 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.868098 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.868427 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.868449 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.868461 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.868254 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.870327 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.871280 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.873466 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.875454 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.877822 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.878965 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.879900 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.971214 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.971249 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.971258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.971272 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:10 crc kubenswrapper[4959]: I1007 13:01:10.971282 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:10Z","lastTransitionTime":"2025-10-07T13:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.059365 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.059438 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.059454 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"58b16c71f758dd8bee537432115093da6a8eeec042ae73ac65257ff936307185"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.061269 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4d739315b0426d7a81aa9290ec7c2e0fff7213c18179bb1bbb965f1b2957ee2c"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.062922 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.062953 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.062964 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3a542306308bb030d83d6548090fd3b4cbbcf2b99d2bc4d059c7726a3290ed40"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.065200 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.065235 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"aaa8eeb1dfe172169720654a2196a8fba820e606d6231ef5ec6f8b2dd41ebd93"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.068320 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.070792 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.071915 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.072486 4959 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerStarted","Data":"db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.072548 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerStarted","Data":"781b22b4fe22f862a6070a47d5623a122260061a9a28456bb85004f93ef4883a"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.073221 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.073288 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.073300 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.073354 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.073370 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.074911 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.075380 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" exitCode=0 Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.075480 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.075556 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"bbd780aded1d877ecf4e1cd985d32ca3597d0c60c31c75f13ee41e23a00d0112"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.077266 4959 generic.go:334] "Generic (PLEG): container finished" podID="39c3422b-6d08-4084-835f-3c6eeb42e474" containerID="e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5" exitCode=0 Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.077363 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerDied","Data":"e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.077401 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerStarted","Data":"04db29cd698ea4a06281054936ae9d012034cee2f55a12bfffd9445010129ce9"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.084548 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ln4wb" event={"ID":"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0","Type":"ContainerStarted","Data":"e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.084599 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ln4wb" 
event={"ID":"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0","Type":"ContainerStarted","Data":"411fe99c766f3921e1ada652460e520756de28226e685ad788fe7978ae9d4bac"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.097791 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.112715 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.126589 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.142366 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 
13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\"
:\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.154742 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.170708 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.177483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.177541 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.177553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.177570 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.177582 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.196701 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.231218 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.244903 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.257445 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.274424 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.282285 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.282345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:11 crc 
kubenswrapper[4959]: I1007 13:01:11.282360 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.282378 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.282391 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.289413 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.310747 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.339126 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.382580 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.384434 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.384460 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.384472 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.384490 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:11 
crc kubenswrapper[4959]: I1007 13:01:11.384503 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.419641 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.443509 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.459958 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.481143 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.486857 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.486890 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.486899 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.486919 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.486929 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.492941 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.493117 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.493200 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.493215 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.493285 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 13:01:13.493240908 +0000 UTC m=+25.653963585 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.493336 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:13.4933135 +0000 UTC m=+25.654036177 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.493380 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.493451 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:13.493432594 +0000 UTC m=+25.654155461 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.504141 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.523270 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.546326 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.561247 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.589007 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.589045 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.589055 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.589074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.589085 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.594177 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.594241 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594375 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594419 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594432 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594375 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594516 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594530 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594497 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:13.594476912 +0000 UTC m=+25.755199589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.594607 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:13.594583505 +0000 UTC m=+25.755306182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.691531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.692270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.692283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.692323 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.692346 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.795557 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.795640 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.795656 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.795683 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.795700 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.807846 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.808028 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.808487 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.808561 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.808734 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:01:11 crc kubenswrapper[4959]: E1007 13:01:11.808973 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.822441 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.832930 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.835491 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.845729 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.860011 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T1
3:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.879398 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.898190 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.899418 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.899464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.899478 4959 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.899500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.899852 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:11Z","lastTransitionTime":"2025-10-07T13:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.912641 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.926604 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.943077 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.961733 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.975375 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:11 crc kubenswrapper[4959]: I1007 13:01:11.990713 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.003357 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.003400 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.003411 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 
13:01:12.003431 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.003445 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.007475 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.027483 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.047023 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.061774 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.078015 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.091396 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.091437 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.091446 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.091455 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.091463 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.093612 4959 generic.go:334] "Generic (PLEG): container finished" podID="39c3422b-6d08-4084-835f-3c6eeb42e474" containerID="7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6" exitCode=0 Oct 07 13:01:12 crc 
kubenswrapper[4959]: I1007 13:01:12.093643 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerDied","Data":"7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.106364 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.106406 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.106417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.106435 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.106448 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.107691 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.129907 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.149893 4959 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.177146 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.192249 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.209844 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.209900 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.209914 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.209934 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.209944 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.215917 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.241902 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.266599 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.287504 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.303401 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5
d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.314235 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.314278 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.314289 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.314304 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.314315 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.319858 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.338612 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.353432 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.375441 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.400563 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.417328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.417379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.417393 4959 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.417413 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.417426 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.430699 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.462978 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.514830 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.536022 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.536074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.536084 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.536101 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.536115 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.541853 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.561781 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.597172 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.638851 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.638892 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.638903 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.638921 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.638932 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.639782 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.683144 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.741327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.741364 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.741375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 
13:01:12.741396 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.741410 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.845123 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.845565 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.845577 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.845596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.845611 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.948724 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.949750 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.949777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.949799 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:12 crc kubenswrapper[4959]: I1007 13:01:12.949810 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:12Z","lastTransitionTime":"2025-10-07T13:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.052928 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.052972 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.052980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.052996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.053006 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.101614 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.104893 4959 generic.go:334] "Generic (PLEG): container finished" podID="39c3422b-6d08-4084-835f-3c6eeb42e474" containerID="85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59" exitCode=0 Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.104998 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerDied","Data":"85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.106533 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.135023 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.152398 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.157091 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.157136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.157146 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.157165 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.157179 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.165372 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.178689 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.193220 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.209957 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.227112 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.244252 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 
13:01:13.261330 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.261376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.261388 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.261405 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.261417 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.262070 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb
288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.274425 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.287845 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.301045 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.320833 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.336499 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.353597 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.364946 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.364986 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.364996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.365014 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.365025 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.368821 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.387753 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 
13:01:13.406379 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.436811 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.467503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.467554 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.467566 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.467586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.467598 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.477565 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.518084 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.523348 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.523512 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.523574 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:01:17.52352764 +0000 UTC m=+29.684250477 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.523649 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.523716 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.523793 4959 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:17.523768647 +0000 UTC m=+29.684491524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.523943 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.524072 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:17.524043094 +0000 UTC m=+29.684765971 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.559926 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.570194 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.570231 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.570240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.570257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.570267 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.597191 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.624982 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.625032 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625188 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625221 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625235 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625293 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:17.625274688 +0000 UTC m=+29.785997365 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625352 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625385 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625400 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.625476 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:17.625454553 +0000 UTC m=+29.786177230 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.634876 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns
-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.673191 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.673247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.673259 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.673279 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.673290 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.680872 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.719987 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.776242 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc 
kubenswrapper[4959]: I1007 13:01:13.776303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.776321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.776356 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.776376 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.808677 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.808810 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.809014 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.809161 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.809400 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:13 crc kubenswrapper[4959]: E1007 13:01:13.809791 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.880381 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.880429 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.880444 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.880468 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.880487 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.985290 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.985365 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.985389 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.985418 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:13 crc kubenswrapper[4959]: I1007 13:01:13.985438 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:13Z","lastTransitionTime":"2025-10-07T13:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.090074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.090152 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.090172 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.090202 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.090238 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.114565 4959 generic.go:334] "Generic (PLEG): container finished" podID="39c3422b-6d08-4084-835f-3c6eeb42e474" containerID="1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4" exitCode=0 Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.114697 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerDied","Data":"1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4"} Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.128861 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.155095 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.175424 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.193975 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:14 crc 
kubenswrapper[4959]: I1007 13:01:14.193998 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.194011 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.194030 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.194046 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.202750 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.226449 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.244262 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.260157 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.295358 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.305187 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.305236 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.305251 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.305274 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.305291 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.339109 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.368935 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.388927 4959 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.404372 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.409868 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.409918 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.409935 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 
13:01:14.409957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.409974 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.416585 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:14Z is after 2025-08-24T17:21:41Z"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.512692 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.512778 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.512800 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.512829 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.512849 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.616510 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.616549 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.616562 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.616578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.616588 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.719559 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.719659 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.719681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.719713 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.719732 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.822672 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.823208 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.823220 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.823240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.823255 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.926438 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.926481 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.926493 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.926509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:14 crc kubenswrapper[4959]: I1007 13:01:14.926520 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:14Z","lastTransitionTime":"2025-10-07T13:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.029437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.029486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.029501 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.029520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.029533 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.128587 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"}
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.131710 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.131760 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.131779 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.131802 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.131822 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.138049 4959 generic.go:334] "Generic (PLEG): container finished" podID="39c3422b-6d08-4084-835f-3c6eeb42e474" containerID="eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359" exitCode=0 Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.138123 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerDied","Data":"eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.162874 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.188176 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.206708 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.224035 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.234545 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.234602 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.234614 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.234657 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.234670 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.240649 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.257827 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.283587 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.309076 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.323313 4959 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.335719 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.337389 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.337874 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.337991 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc 
kubenswrapper[4959]: I1007 13:01:15.338113 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.338239 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.349559 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.364851 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.380999 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:15Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.441234 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc 
kubenswrapper[4959]: I1007 13:01:15.441501 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.441611 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.441732 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.441830 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.544776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.544820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.544834 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.544854 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.544867 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.648362 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.648487 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.648514 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.648600 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.648684 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.752462 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.752531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.752550 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.752578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.752603 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.808289 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.808809 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.809131 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:15 crc kubenswrapper[4959]: E1007 13:01:15.809243 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:15 crc kubenswrapper[4959]: E1007 13:01:15.809133 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:15 crc kubenswrapper[4959]: E1007 13:01:15.809670 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.856034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.856360 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.856504 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.856738 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.856916 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.960509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.960596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.960656 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.960693 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:15 crc kubenswrapper[4959]: I1007 13:01:15.960716 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:15Z","lastTransitionTime":"2025-10-07T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.063561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.063595 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.063603 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.063620 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.063650 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.147516 4959 generic.go:334] "Generic (PLEG): container finished" podID="39c3422b-6d08-4084-835f-3c6eeb42e474" containerID="15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803" exitCode=0 Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.147575 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerDied","Data":"15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.167706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.168303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.168507 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.168730 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.168923 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.169729 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.193386 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.209089 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5
d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.226774 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.247878 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.267861 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.272371 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.272409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.272423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.272784 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.272827 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.297520 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.314055 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.334318 4959 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.348129 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.360329 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.373097 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.375892 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.375935 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.375956 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.375977 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.375993 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.388756 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:
01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:16Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.479376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.479680 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.479777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.479872 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.479959 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.586984 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.587046 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.587061 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.587083 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.587101 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.691140 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.691542 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.691961 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.692270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.692563 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.795866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.795914 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.795938 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.795970 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.795995 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.899254 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.899338 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.899360 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.899396 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:16 crc kubenswrapper[4959]: I1007 13:01:16.899418 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:16Z","lastTransitionTime":"2025-10-07T13:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.001821 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.001890 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.001905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.001931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.001947 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.109066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.109450 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.109461 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.109479 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.109493 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.157537 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.158005 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.163098 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" event={"ID":"39c3422b-6d08-4084-835f-3c6eeb42e474","Type":"ContainerStarted","Data":"eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.175581 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.184487 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.187863 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.198534 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.210445 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.211673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.211698 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.211708 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.211731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.211741 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.223124 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.234142 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.247642 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.260244 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5
d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.273332 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.288122 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.301527 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.314665 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.314700 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.314710 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.314730 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.314741 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.325586 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.339058 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.350863 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.362874 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.381180 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.399969 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.414215 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.416721 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.416754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.416767 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.416786 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.416796 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.427238 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.446223 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.458199 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.470746 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.482261 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.496473 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.509420 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.522756 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.522838 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.522866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.522898 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.522924 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.524324 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:17Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.571201 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.571501 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-07 13:01:25.571472128 +0000 UTC m=+37.732194845 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.571564 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.571601 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.571724 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.571786 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-07 13:01:25.571775206 +0000 UTC m=+37.732497893 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.571793 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.571883 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:25.571828578 +0000 UTC m=+37.732551265 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.625949 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.626018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.626036 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.626058 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.626071 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.673011 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.673076 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673235 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673256 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673258 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673314 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673329 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673397 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:25.67337383 +0000 UTC m=+37.834096497 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673269 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.673478 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:25.673460483 +0000 UTC m=+37.834183160 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.729422 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.729485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.729504 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.729530 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.729550 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.808399 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.808495 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.808399 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.808703 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.808837 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:17 crc kubenswrapper[4959]: E1007 13:01:17.808916 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.834839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.834912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.834937 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.834973 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.834999 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.938283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.938346 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.938362 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.938388 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:17 crc kubenswrapper[4959]: I1007 13:01:17.938402 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:17Z","lastTransitionTime":"2025-10-07T13:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.041904 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.041986 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.042016 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.042070 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.042095 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.144960 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.145026 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.145038 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.145062 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.145079 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.167270 4959 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.167702 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.198597 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.214465 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379
b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.233094 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.247640 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.247696 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.247709 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.247729 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.248181 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.249036 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.264680 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.283467 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.296644 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.308076 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ho
sts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.326723 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.342878 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.351646 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc 
kubenswrapper[4959]: I1007 13:01:18.351681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.351690 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.351704 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.351716 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.366793 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5
d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.383685 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.396867 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.413065 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.454583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.454656 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.454669 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.454690 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.454703 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.558790 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.558829 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.558842 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.558860 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.558876 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.681505 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.681566 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.681582 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.681606 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.681662 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.784277 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.784588 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.784597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.784613 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.784639 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.828788 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.851004 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.864348 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.885061 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.887012 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.887046 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.887057 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.887076 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.887088 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.907530 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.923202 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.944766 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.966377 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.989572 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.989647 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.989662 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.989682 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.989695 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:18Z","lastTransitionTime":"2025-10-07T13:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:18 crc kubenswrapper[4959]: I1007 13:01:18.997368 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.013512 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.030747 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ho
sts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.051085 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.070264 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.093039 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc 
kubenswrapper[4959]: I1007 13:01:19.093090 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.093099 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.093116 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.093127 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.170226 4959 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.196044 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.196097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.196109 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.196125 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.196136 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.299106 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.299148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.299156 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.299173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.299187 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.402969 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.403034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.403046 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.403070 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.403083 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.506126 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.506178 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.506189 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.506205 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.506217 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.608596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.608663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.608673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.608691 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.608700 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.711415 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.711486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.711502 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.711527 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.711544 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.808296 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.808382 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.808296 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.808529 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.808655 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.808861 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.814114 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.814163 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.814174 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.814193 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.814208 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.842742 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.842795 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.842809 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.842830 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.842842 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.863607 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.869029 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.869115 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.869142 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.869183 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.869211 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.892227 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.897548 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.897602 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.897617 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.897653 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.897667 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.913889 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.919240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.919312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.919329 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.919359 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.919376 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.936512 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.941173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.941233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.941253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.941277 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.941293 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.957529 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:19 crc kubenswrapper[4959]: E1007 13:01:19.957734 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.959686 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.959726 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.959737 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.959757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:19 crc kubenswrapper[4959]: I1007 13:01:19.959771 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:19Z","lastTransitionTime":"2025-10-07T13:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.062996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.063056 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.063067 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.063088 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.063098 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.165810 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.165851 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.165861 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.165877 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.165887 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.172824 4959 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.268107 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.268181 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.268196 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.268220 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.268237 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.370513 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.370546 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.370554 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.370572 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.370583 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.474530 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.474578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.474589 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.474610 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.474636 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.578527 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.578579 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.578594 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.578657 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.578672 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.682338 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.682375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.682383 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.682398 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.682409 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.786130 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.786182 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.786193 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.786211 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.786224 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.890193 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.890253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.890267 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.890295 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.890308 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.992753 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.992796 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.992805 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.992824 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:20 crc kubenswrapper[4959]: I1007 13:01:20.992835 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:20Z","lastTransitionTime":"2025-10-07T13:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.095019 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.095060 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.095069 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.095084 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.095095 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.177309 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/0.log" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.180123 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90" exitCode=1 Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.180231 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90"} Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.181137 4959 scope.go:117] "RemoveContainer" containerID="a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.200801 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.200846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.200859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.200877 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.200890 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.204095 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.226824 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.248207 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.271107 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.290666 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.303042 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.303075 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.303084 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.303099 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.303109 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.310572 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.330277 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.345530 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.358256 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.371582 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.393438 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.406132 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.406193 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.406209 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.406574 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.406613 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.423789 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:20Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:20.441519 6207 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:01:20.441622 6207 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI1007 13:01:20.441717 6207 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:01:20.441763 6207 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:01:20.441811 6207 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:20.441862 6207 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:01:20.441977 6207 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:20.442041 6207 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:01:20.442118 6207 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:01:20.442191 6207 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:01:20.442126 6207 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:01:20.442212 6207 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 13:01:20.442287 6207 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:20.442351 6207 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:01:20.442357 6207 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.437764 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.510508 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.510559 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.510575 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:21 crc 
kubenswrapper[4959]: I1007 13:01:21.510602 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.510616 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.613054 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.613096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.613105 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.613120 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.613130 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.715719 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.715770 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.715785 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.715805 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.715818 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.807926 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.807986 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.807944 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:01:21 crc kubenswrapper[4959]: E1007 13:01:21.808144 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:01:21 crc kubenswrapper[4959]: E1007 13:01:21.808249 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:01:21 crc kubenswrapper[4959]: E1007 13:01:21.808315 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.818610 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.818656 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.818694 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.818710 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.818723 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.921833 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.921874 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.921884 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.921902 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:21 crc kubenswrapper[4959]: I1007 13:01:21.921913 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:21Z","lastTransitionTime":"2025-10-07T13:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.024508 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.024558 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.024568 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.024586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.024598 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.127537 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.127608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.127649 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.127673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.127690 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.186393 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/0.log" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.189941 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.190150 4959 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.204821 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4
ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.219983 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resol
ver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.229878 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.229922 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.229932 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.229951 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.229967 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.236542 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.252302 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.275186 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.295024 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.311320 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.331467 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.333324 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.333356 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.333367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.333384 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.333396 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.348952 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb
288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.369227 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.388398 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.407410 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.428920 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:20Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:20.441519 6207 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:01:20.441622 6207 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:20.441717 6207 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 
13:01:20.441763 6207 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:01:20.441811 6207 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:20.441862 6207 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:01:20.441977 6207 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:20.442041 6207 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:01:20.442118 6207 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:01:20.442191 6207 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:01:20.442126 6207 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:01:20.442212 6207 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 13:01:20.442287 6207 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:20.442351 6207 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:01:20.442357 6207 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.436535 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.436609 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.436646 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.436673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.436689 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.540208 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.540313 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.540345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.540381 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.540404 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.573021 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.644180 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.644236 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.644249 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.644274 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.644289 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.720887 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.741378 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.747553 4959 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.747617 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.747664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.747693 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.747719 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.756889 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.778922 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.800562 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.830163 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.850983 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.851104 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.851135 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.851170 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.851197 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.858661 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.886246 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for 
caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.909848 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.932819 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.949382 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.955435 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.955492 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.955505 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.955531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.955543 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:22Z","lastTransitionTime":"2025-10-07T13:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.977408 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:20Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:20.441519 6207 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:01:20.441622 6207 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:20.441717 6207 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 
13:01:20.441763 6207 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:01:20.441811 6207 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:20.441862 6207 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:01:20.441977 6207 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:20.442041 6207 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:01:20.442118 6207 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:01:20.442191 6207 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:01:20.442126 6207 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:01:20.442212 6207 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 13:01:20.442287 6207 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:20.442351 6207 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:01:20.442357 6207 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:22 crc kubenswrapper[4959]: I1007 13:01:22.991881 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.007554 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.057784 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.057854 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.057870 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc 
kubenswrapper[4959]: I1007 13:01:23.057893 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.057907 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.161164 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.161275 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.161370 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.161417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.161448 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.195333 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5"] Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.197558 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.198271 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/1.log" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.200495 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/0.log" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.201533 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.202829 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.211473 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528" exitCode=1 Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.211557 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.211741 4959 scope.go:117] "RemoveContainer" 
containerID="a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.214355 4959 scope.go:117] "RemoveContainer" containerID="79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528" Oct 07 13:01:23 crc kubenswrapper[4959]: E1007 13:01:23.215001 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.221662 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.241120 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.256088 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.265166 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.265236 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.265251 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.265269 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.265691 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.273118 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.287943 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.304877 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.321727 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.340647 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68f2\" (UniqueName: \"kubernetes.io/projected/998ce932-909a-4460-868b-149812f4c695-kube-api-access-f68f2\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.340744 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/998ce932-909a-4460-868b-149812f4c695-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.340822 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/998ce932-909a-4460-868b-149812f4c695-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.340897 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/998ce932-909a-4460-868b-149812f4c695-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.344473 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.367433 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.369199 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.369286 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.369301 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.370380 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.370698 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.390725 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:20Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:20.441519 6207 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:01:20.441622 6207 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:20.441717 6207 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 
13:01:20.441763 6207 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:01:20.441811 6207 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:20.441862 6207 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:01:20.441977 6207 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:20.442041 6207 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:01:20.442118 6207 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:01:20.442191 6207 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:01:20.442126 6207 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:01:20.442212 6207 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 13:01:20.442287 6207 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:20.442351 6207 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:01:20.442357 6207 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.406588 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.419459 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.434873 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.442804 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/998ce932-909a-4460-868b-149812f4c695-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.442952 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68f2\" (UniqueName: \"kubernetes.io/projected/998ce932-909a-4460-868b-149812f4c695-kube-api-access-f68f2\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.443019 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/998ce932-909a-4460-868b-149812f4c695-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.443075 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/998ce932-909a-4460-868b-149812f4c695-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.443689 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/998ce932-909a-4460-868b-149812f4c695-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.444264 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/998ce932-909a-4460-868b-149812f4c695-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.452373 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/998ce932-909a-4460-868b-149812f4c695-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: 
\"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.453165 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\
\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.459837 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68f2\" (UniqueName: \"kubernetes.io/projected/998ce932-909a-4460-868b-149812f4c695-kube-api-access-f68f2\") pod \"ovnkube-control-plane-749d76644c-4tbb5\" (UID: \"998ce932-909a-4460-868b-149812f4c695\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.477815 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.477859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.477874 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.477899 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.477914 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.480814 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.499753 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.517649 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.530003 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.530848 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.552443 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: W1007 13:01:23.556882 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod998ce932_909a_4460_868b_149812f4c695.slice/crio-260a6c6c2adf5ecd5c5d68559c56106377dae31b9f41c2725defe64ee92c5a9e WatchSource:0}: Error finding container 260a6c6c2adf5ecd5c5d68559c56106377dae31b9f41c2725defe64ee92c5a9e: Status 404 returned error can't find the container with id 260a6c6c2adf5ecd5c5d68559c56106377dae31b9f41c2725defe64ee92c5a9e Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.572142 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.581927 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.581998 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.582013 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.582035 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.582274 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.592124 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.614018 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.640385 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.664008 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.682365 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.685114 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.685171 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.685190 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc 
kubenswrapper[4959]: I1007 13:01:23.685215 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.685234 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.700834 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.721969 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.742027 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:20Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:20.441519 6207 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:01:20.441622 6207 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:20.441717 6207 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 
13:01:20.441763 6207 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:01:20.441811 6207 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:20.441862 6207 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:01:20.441977 6207 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:20.442041 6207 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:01:20.442118 6207 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:01:20.442191 6207 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:01:20.442126 6207 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:01:20.442212 6207 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 13:01:20.442287 6207 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:20.442351 6207 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:01:20.442357 6207 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419
b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:23Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.788602 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.788688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.788703 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.788747 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.788761 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.807780 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.807896 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.808003 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:23 crc kubenswrapper[4959]: E1007 13:01:23.808115 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:23 crc kubenswrapper[4959]: E1007 13:01:23.808293 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:23 crc kubenswrapper[4959]: E1007 13:01:23.808411 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.890942 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.890980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.890993 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.891012 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.891024 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.994148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.994200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.994217 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.994243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:23 crc kubenswrapper[4959]: I1007 13:01:23.994262 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:23Z","lastTransitionTime":"2025-10-07T13:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.097387 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.097432 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.097444 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.097463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.097480 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.201137 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.201207 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.201228 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.201258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.201280 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.217782 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" event={"ID":"998ce932-909a-4460-868b-149812f4c695","Type":"ContainerStarted","Data":"55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.217915 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" event={"ID":"998ce932-909a-4460-868b-149812f4c695","Type":"ContainerStarted","Data":"d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.217994 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" event={"ID":"998ce932-909a-4460-868b-149812f4c695","Type":"ContainerStarted","Data":"260a6c6c2adf5ecd5c5d68559c56106377dae31b9f41c2725defe64ee92c5a9e"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.220393 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/1.log" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.225512 4959 scope.go:117] "RemoveContainer" containerID="79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528" Oct 07 13:01:24 crc kubenswrapper[4959]: E1007 13:01:24.225799 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.236838 4959 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.263195 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.285389 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.304765 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc 
kubenswrapper[4959]: I1007 13:01:24.304832 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.304852 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.304885 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.304904 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.308079 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\"
,\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.325738 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.339831 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.349141 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7xjp6"] Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.349989 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g57ch"] Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.350353 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.351366 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:24 crc kubenswrapper[4959]: E1007 13:01:24.351549 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.354060 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.356250 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.362488 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.363619 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.367854 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354e
a61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.380830 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.393398 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.407706 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.408193 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.408258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.408280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.408307 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.408328 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.427271 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.445557 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.460202 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5bk\" (UniqueName: \"kubernetes.io/projected/ed03c94e-16fb-42f7-8383-ac7c2c403298-kube-api-access-pd5bk\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.460269 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.460300 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zdrg\" (UniqueName: \"kubernetes.io/projected/024b990f-a1e8-4ebf-9d60-48afd626881d-kube-api-access-5zdrg\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.460357 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/024b990f-a1e8-4ebf-9d60-48afd626881d-host\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.460377 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/024b990f-a1e8-4ebf-9d60-48afd626881d-serviceca\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.479707 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90841563ab075ce8bcab4eec4c6d9861453f8dc5d83b77afc99664e6ce47b90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:20Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:20.441519 6207 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:01:20.441622 6207 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:20.441717 6207 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 
13:01:20.441763 6207 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:01:20.441811 6207 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:20.441862 6207 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:01:20.441977 6207 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:20.442041 6207 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:01:20.442118 6207 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:01:20.442191 6207 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:01:20.442126 6207 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:01:20.442212 6207 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 13:01:20.442287 6207 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:20.442351 6207 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:01:20.442357 6207 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419
b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.494694 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.516142 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.516520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.516540 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.516548 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc 
kubenswrapper[4959]: I1007 13:01:24.516566 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.516577 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.534050 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.551191 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.561636 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zdrg\" (UniqueName: \"kubernetes.io/projected/024b990f-a1e8-4ebf-9d60-48afd626881d-kube-api-access-5zdrg\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.561719 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/024b990f-a1e8-4ebf-9d60-48afd626881d-host\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.561739 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/024b990f-a1e8-4ebf-9d60-48afd626881d-serviceca\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.561771 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd5bk\" (UniqueName: \"kubernetes.io/projected/ed03c94e-16fb-42f7-8383-ac7c2c403298-kube-api-access-pd5bk\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:24 crc 
kubenswrapper[4959]: I1007 13:01:24.561791 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.561893 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/024b990f-a1e8-4ebf-9d60-48afd626881d-host\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: E1007 13:01:24.561906 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:24 crc kubenswrapper[4959]: E1007 13:01:24.562037 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:25.062009766 +0000 UTC m=+37.222732443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.564335 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/024b990f-a1e8-4ebf-9d60-48afd626881d-serviceca\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.568534 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.582056 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zdrg\" (UniqueName: \"kubernetes.io/projected/024b990f-a1e8-4ebf-9d60-48afd626881d-kube-api-access-5zdrg\") pod \"node-ca-7xjp6\" (UID: \"024b990f-a1e8-4ebf-9d60-48afd626881d\") " pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 
13:01:24.584919 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5bk\" (UniqueName: \"kubernetes.io/projected/ed03c94e-16fb-42f7-8383-ac7c2c403298-kube-api-access-pd5bk\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.625460 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.625526 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.625546 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.625578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.625602 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.630142 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.650272 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc 
kubenswrapper[4959]: I1007 13:01:24.675307 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7xjp6" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.676406 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnib
in\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8c
cf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.695999 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
25-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: W1007 13:01:24.699060 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024b990f_a1e8_4ebf_9d60_48afd626881d.slice/crio-b75be5b177b833678e28b8afbdcfd02616fff1daf4d2338ffcbfea5bbbf13bb8 WatchSource:0}: Error finding container b75be5b177b833678e28b8afbdcfd02616fff1daf4d2338ffcbfea5bbbf13bb8: Status 404 returned error can't find the container with id b75be5b177b833678e28b8afbdcfd02616fff1daf4d2338ffcbfea5bbbf13bb8 Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.720010 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\"
,\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.728834 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.728881 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.728904 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.728929 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.728948 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.742370 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.756656 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.780274 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.806110 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 
13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.824194 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.833272 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.833330 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.833344 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.833367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.833382 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.843885 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.863351 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.936244 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.936298 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.936309 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 
13:01:24.936327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:24 crc kubenswrapper[4959]: I1007 13:01:24.936341 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:24Z","lastTransitionTime":"2025-10-07T13:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.040573 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.040658 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.040676 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.040698 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.040715 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.068151 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.068335 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.068420 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:26.06838086 +0000 UTC m=+38.229103537 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.144715 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.144770 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.144780 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.144801 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.144813 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.231124 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7xjp6" event={"ID":"024b990f-a1e8-4ebf-9d60-48afd626881d","Type":"ContainerStarted","Data":"67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.231237 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7xjp6" event={"ID":"024b990f-a1e8-4ebf-9d60-48afd626881d","Type":"ContainerStarted","Data":"b75be5b177b833678e28b8afbdcfd02616fff1daf4d2338ffcbfea5bbbf13bb8"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.248097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.248173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.248199 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.248234 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.248274 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.255681 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.285513 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 
13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.300751 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.322955 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.337032 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.351583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.351663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.351675 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.351694 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.351706 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.357138 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.374955 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.387381 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.403474 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.419147 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.432070 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc 
kubenswrapper[4959]: I1007 13:01:25.454703 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aef
e690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.455030 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.455142 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.455172 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.455750 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.455887 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.467729 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.485271 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T
13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.502440 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.517001 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.560835 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.560910 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.560937 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.560971 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.560995 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.573911 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.574116 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 13:01:41.574075436 +0000 UTC m=+53.734798163 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.574181 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.574243 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.574433 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.574455 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.574523 4959 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:41.574506108 +0000 UTC m=+53.735228815 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.574561 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:41.574535289 +0000 UTC m=+53.735258186 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.663606 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.663709 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.663731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.663754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.663768 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.676076 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.676185 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676338 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676377 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676398 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676464 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:41.676437841 +0000 UTC m=+53.837160728 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676338 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676510 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676528 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.676592 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:41.676576185 +0000 UTC m=+53.837299072 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.767464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.767541 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.767562 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.767591 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.767612 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.808581 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.808855 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.808939 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.809146 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.809234 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.809301 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.809356 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:25 crc kubenswrapper[4959]: E1007 13:01:25.809407 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.871843 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.871914 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.871935 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.871965 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.871986 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.976016 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.976079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.976111 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.976135 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:25 crc kubenswrapper[4959]: I1007 13:01:25.976150 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:25Z","lastTransitionTime":"2025-10-07T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.079421 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.079477 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.079491 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.079511 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.079526 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.080091 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:26 crc kubenswrapper[4959]: E1007 13:01:26.080410 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:26 crc kubenswrapper[4959]: E1007 13:01:26.080885 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:28.080531872 +0000 UTC m=+40.241254739 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.182813 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.182884 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.182903 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.182931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.182951 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.286733 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.286823 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.286851 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.286884 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.286908 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.390743 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.390864 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.390885 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.390917 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.390941 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.494161 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.494231 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.494255 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.494284 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.494306 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.598509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.598616 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.598667 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.598696 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.598716 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.702148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.702871 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.702944 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.702992 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.703013 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.807261 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.807320 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.807337 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.807363 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.807381 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.910765 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.910828 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.910848 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.910875 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:26 crc kubenswrapper[4959]: I1007 13:01:26.910894 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:26Z","lastTransitionTime":"2025-10-07T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.014699 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.014794 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.014833 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.014865 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.014884 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.119018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.119080 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.119097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.119127 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.119145 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.222818 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.222894 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.222911 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.222939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.222960 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.327187 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.327247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.327265 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.327292 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.327309 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.430583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.430695 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.430721 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.430752 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.430774 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.533586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.533653 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.533670 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.533692 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.533705 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.637118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.637194 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.637212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.637228 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.637241 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.740266 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.740339 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.740361 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.740393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.740418 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.808482 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.808613 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.808509 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.808509 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:27 crc kubenswrapper[4959]: E1007 13:01:27.808828 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:27 crc kubenswrapper[4959]: E1007 13:01:27.808978 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:27 crc kubenswrapper[4959]: E1007 13:01:27.809172 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:27 crc kubenswrapper[4959]: E1007 13:01:27.809331 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.844071 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.844166 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.844192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.844229 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.844256 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.947449 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.947516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.947534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.947561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:27 crc kubenswrapper[4959]: I1007 13:01:27.947582 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:27Z","lastTransitionTime":"2025-10-07T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.051078 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.051139 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.051158 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.051186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.051207 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.107770 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:28 crc kubenswrapper[4959]: E1007 13:01:28.108042 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:28 crc kubenswrapper[4959]: E1007 13:01:28.108165 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:32.108141328 +0000 UTC m=+44.268864005 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.154063 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.154167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.154185 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.154204 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.154216 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.257543 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.257612 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.257665 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.257692 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.257714 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.360705 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.360815 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.360836 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.360867 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.360889 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.464240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.464291 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.464310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.464334 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.464351 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.567220 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.567302 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.567322 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.567351 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.567371 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.670749 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.670819 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.670839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.670868 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.670887 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.774912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.774966 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.774985 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.775011 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.775028 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.832222 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.861924 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T
13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.878668 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.878780 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.878803 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.878829 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.878846 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.889061 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.911600 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.936983 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.973331 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 
13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.981432 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.981502 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.981520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.981547 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.981566 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:28Z","lastTransitionTime":"2025-10-07T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:28 crc kubenswrapper[4959]: I1007 13:01:28.995286 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.014189 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.031492 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.053869 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.074296 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.085530 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc 
kubenswrapper[4959]: I1007 13:01:29.085594 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.085618 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.085684 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.085709 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.092337 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.111035 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.127458 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.149380 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.168928 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:29 crc 
kubenswrapper[4959]: I1007 13:01:29.189258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.189347 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.189367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.189393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.189416 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.292621 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.292734 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.292757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.292784 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.292807 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.396232 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.396321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.396339 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.396368 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.396390 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.500085 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.500167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.500197 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.500230 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.500257 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.605776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.605871 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.605892 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.605974 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.606000 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.709749 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.709810 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.709825 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.709854 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.709872 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.808384 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.808516 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.808572 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.808881 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:29 crc kubenswrapper[4959]: E1007 13:01:29.808884 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:29 crc kubenswrapper[4959]: E1007 13:01:29.809122 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:29 crc kubenswrapper[4959]: E1007 13:01:29.809270 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:29 crc kubenswrapper[4959]: E1007 13:01:29.809431 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.813862 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.813931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.814000 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.814028 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.814047 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.917314 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.917402 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.917425 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.917461 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:29 crc kubenswrapper[4959]: I1007 13:01:29.917556 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:29Z","lastTransitionTime":"2025-10-07T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.021407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.021511 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.021538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.021571 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.021594 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.126446 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.126516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.126526 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.126553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.126572 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.136976 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.137028 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.137037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.137049 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.137057 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: E1007 13:01:30.162527 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.169640 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.169684 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.169695 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.169718 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.169732 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.192332 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.192407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.192425 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.192456 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.192476 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.215776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.215835 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.215854 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.215875 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.215894 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: E1007 13:01:30.232826 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.238108 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.238165 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.238179 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.238206 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.238222 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: E1007 13:01:30.255261 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:30 crc kubenswrapper[4959]: E1007 13:01:30.255508 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.257777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.257895 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.257924 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.257956 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.257979 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.361623 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.361744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.361763 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.361793 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.361816 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.465156 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.465225 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.465247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.465278 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.465302 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.568770 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.568843 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.568862 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.568891 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.568917 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.673184 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.673257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.673270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.673296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.673311 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.777199 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.777270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.777288 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.777312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.777330 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.881998 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.882071 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.882093 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.882123 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.882148 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.986767 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.986848 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.986867 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.986899 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:30 crc kubenswrapper[4959]: I1007 13:01:30.986952 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:30Z","lastTransitionTime":"2025-10-07T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.090077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.090192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.090226 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.090285 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.090311 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.194573 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.194694 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.194723 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.194760 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.194784 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.299075 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.299153 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.299169 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.299190 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.299207 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.406652 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.406749 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.406800 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.406843 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.406870 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.513225 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.513294 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.513312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.513340 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.513359 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.617267 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.617386 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.617413 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.617439 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.617457 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.721948 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.722019 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.722038 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.722069 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.722092 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.807935 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:31 crc kubenswrapper[4959]: E1007 13:01:31.808187 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.808558 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.808620 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.808676 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:31 crc kubenswrapper[4959]: E1007 13:01:31.808824 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:31 crc kubenswrapper[4959]: E1007 13:01:31.809199 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:31 crc kubenswrapper[4959]: E1007 13:01:31.809513 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.826001 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.826049 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.826070 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.826095 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.826118 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.930712 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.930800 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.930824 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.930858 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:31 crc kubenswrapper[4959]: I1007 13:01:31.930879 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:31Z","lastTransitionTime":"2025-10-07T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.034900 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.035132 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.035162 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.035201 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.035220 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.139879 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.139970 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.139990 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.140022 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.140044 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.157671 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:32 crc kubenswrapper[4959]: E1007 13:01:32.157906 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:32 crc kubenswrapper[4959]: E1007 13:01:32.158027 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:40.157989261 +0000 UTC m=+52.318711968 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.243979 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.244034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.244050 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.244077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.244097 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.348404 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.348472 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.348491 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.348514 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.348532 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.452826 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.452924 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.452948 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.452983 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.453009 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.555733 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.555828 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.555853 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.555887 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.555911 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.660310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.660397 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.660420 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.660448 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.660469 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.763724 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.763798 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.763820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.763849 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.763870 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.866921 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.866990 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.867012 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.867050 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.867074 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.970454 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.970531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.970559 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.970591 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:32 crc kubenswrapper[4959]: I1007 13:01:32.970614 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:32Z","lastTransitionTime":"2025-10-07T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.074080 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.074132 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.074150 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.074176 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.074194 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.178202 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.178263 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.178281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.178307 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.178325 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.281674 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.281731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.281743 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.281793 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.281809 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.385878 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.385934 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.385952 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.385980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.385999 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.490135 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.490202 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.490216 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.490237 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.490253 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.594724 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.594790 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.594802 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.594825 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.594841 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.698370 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.699378 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.699679 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.699791 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.699848 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.804787 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.804844 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.804856 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.804878 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.804891 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.808386 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.808538 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:33 crc kubenswrapper[4959]: E1007 13:01:33.808771 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.808986 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:33 crc kubenswrapper[4959]: E1007 13:01:33.809000 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:33 crc kubenswrapper[4959]: E1007 13:01:33.809434 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.809540 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:33 crc kubenswrapper[4959]: E1007 13:01:33.809900 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.908583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.908674 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.908694 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.908723 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:33 crc kubenswrapper[4959]: I1007 13:01:33.908743 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:33Z","lastTransitionTime":"2025-10-07T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.011853 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.011903 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.011915 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.011934 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.011948 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.115552 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.115663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.115687 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.115711 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.115732 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.219093 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.219153 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.219167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.219189 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.219205 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.323102 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.323156 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.323174 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.323200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.323219 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.427128 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.427525 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.427611 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.427785 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.427890 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.531939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.532209 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.532272 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.532348 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.532454 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.635837 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.635888 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.635898 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.635918 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.635931 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.739409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.739750 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.739820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.739900 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.739967 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.843224 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.843876 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.844078 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.844296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.844484 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.947342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.947729 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.947883 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.948014 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:34 crc kubenswrapper[4959]: I1007 13:01:34.948127 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:34Z","lastTransitionTime":"2025-10-07T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.051385 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.051805 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.051926 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.052027 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.052107 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.155860 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.156173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.156334 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.156462 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.156571 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.260687 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.260755 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.260774 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.260846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.260866 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.364352 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.364417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.364433 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.364454 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.364468 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.468403 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.468860 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.468891 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.468982 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.469020 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.572310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.572359 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.572372 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.572393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.572407 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.675957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.676029 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.676055 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.676085 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.676167 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.788958 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.789046 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.789069 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.789100 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.789124 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.808569 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.808609 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.808577 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:35 crc kubenswrapper[4959]: E1007 13:01:35.808835 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:35 crc kubenswrapper[4959]: E1007 13:01:35.808993 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:35 crc kubenswrapper[4959]: E1007 13:01:35.809112 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.809268 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:35 crc kubenswrapper[4959]: E1007 13:01:35.809598 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.892516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.892587 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.892609 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.892672 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.892699 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.995617 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.995706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.995723 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.995749 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:35 crc kubenswrapper[4959]: I1007 13:01:35.995767 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:35Z","lastTransitionTime":"2025-10-07T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.099046 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.099103 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.099115 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.099140 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.099155 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.202432 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.202473 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.202488 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.202507 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.202520 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.304362 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.304443 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.304460 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.304488 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.304507 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.407491 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.407534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.407543 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.407560 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.407570 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.511084 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.511159 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.511177 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.511206 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.511223 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.613885 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.613940 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.613951 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.613969 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.613983 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.716577 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.716616 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.716653 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.716670 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.716684 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.819017 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.819056 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.819067 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.819082 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.819091 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.923425 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.923475 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.923485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.923503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:36 crc kubenswrapper[4959]: I1007 13:01:36.923514 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:36Z","lastTransitionTime":"2025-10-07T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.026718 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.026793 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.026807 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.026827 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.026840 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.129534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.129663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.129690 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.129719 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.129747 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.232006 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.232048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.232059 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.232078 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.232101 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.334329 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.334383 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.334399 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.334424 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.334439 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.436375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.436401 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.436409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.436422 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.436432 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.539757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.539835 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.539850 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.539871 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.539914 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.642561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.642662 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.642675 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.642692 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.642704 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.745680 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.745748 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.745774 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.745805 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.745824 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.808253 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:37 crc kubenswrapper[4959]: E1007 13:01:37.808498 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.808817 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.808845 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.808817 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:37 crc kubenswrapper[4959]: E1007 13:01:37.809410 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:37 crc kubenswrapper[4959]: E1007 13:01:37.809733 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:37 crc kubenswrapper[4959]: E1007 13:01:37.809986 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.849711 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.849785 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.849805 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.849830 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.849848 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.952959 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.953014 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.953026 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.953053 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:37 crc kubenswrapper[4959]: I1007 13:01:37.953065 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:37Z","lastTransitionTime":"2025-10-07T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.056059 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.056101 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.056117 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.056138 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.056153 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.158743 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.159062 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.159176 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.159297 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.159435 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.262239 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.262508 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.262652 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.262802 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.262913 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.365498 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.365585 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.365611 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.365701 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.365728 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.469336 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.469403 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.469423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.469451 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.469471 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.572905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.572985 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.573022 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.573063 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.573092 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.677122 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.677181 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.677205 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.677241 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.677273 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.781371 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.781460 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.781486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.781517 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.781544 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.811471 4959 scope.go:117] "RemoveContainer" containerID="79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.829738 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.847944 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 
13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.861870 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.876804 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.884066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.884107 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.884119 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc 
kubenswrapper[4959]: I1007 13:01:38.884135 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.884149 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.891238 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.906796 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.918614 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.934843 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.950296 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.968498 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.980851 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:38 crc 
kubenswrapper[4959]: I1007 13:01:38.986798 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.986839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.986849 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.986867 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.986879 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:38Z","lastTransitionTime":"2025-10-07T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:38 crc kubenswrapper[4959]: I1007 13:01:38.996596 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.012129 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.025538 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.039026 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.049173 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.089220 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.089271 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.089283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.089303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.089317 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.192198 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.192256 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.192274 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.192298 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.192316 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.293659 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.293690 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.293700 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.293714 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.293723 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.294302 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/1.log" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.296486 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.296918 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.314017 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b924
55d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.326746 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.339506 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.353229 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.369210 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.379939 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc 
kubenswrapper[4959]: I1007 13:01:39.394303 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda1706264796
75932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.396113 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.396144 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.396153 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.396169 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.396179 4959 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.409611 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.429858 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.449179 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.462250 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.478616 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.493455 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.498467 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.498521 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.498538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.498561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.498580 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.509570 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.525840 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.553870 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 
13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.601260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.601327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.601341 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.601367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.601383 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.704350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.704392 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.704401 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.704418 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.704430 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.807204 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.807252 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.807260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.807275 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.807287 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.808172 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.808246 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.808294 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:39 crc kubenswrapper[4959]: E1007 13:01:39.808357 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.808264 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:39 crc kubenswrapper[4959]: E1007 13:01:39.808710 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:39 crc kubenswrapper[4959]: E1007 13:01:39.808880 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:39 crc kubenswrapper[4959]: E1007 13:01:39.809054 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.909531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.909583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.909594 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.909615 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:39 crc kubenswrapper[4959]: I1007 13:01:39.909648 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:39Z","lastTransitionTime":"2025-10-07T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.012673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.012722 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.012735 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.012752 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.012765 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.115439 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.115490 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.115503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.115520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.115533 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.218535 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.218607 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.218642 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.218670 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.218687 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.252804 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.252994 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.253050 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:01:56.253036124 +0000 UTC m=+68.413758801 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.302378 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/2.log" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.303551 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/1.log" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.306517 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629" exitCode=1 Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.306578 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.306693 4959 scope.go:117] "RemoveContainer" containerID="79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.308073 4959 scope.go:117] "RemoveContainer" containerID="134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.308429 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.321560 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.321610 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.321640 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.321662 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.321674 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.327517 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.348514 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.365435 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.381493 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc 
kubenswrapper[4959]: I1007 13:01:40.401594 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aef
e690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.415803 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.425512 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.425573 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.425592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.425623 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.425669 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.435004 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] 
\\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.449854 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.456683 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.456753 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.456772 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.456802 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.456820 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.465868 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.471043 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.475639 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.475682 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.475693 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.475712 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.475729 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.486147 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.488797 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.493497 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.493552 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.493565 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.493589 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.493604 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.511484 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.514511 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79568926bc5c41cdabd9bd758a90b0b8766a80c8ceea787dc78144b582f82528\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"message\\\":\\\"71ms\\\\nI1007 13:01:22.285321 6356 services_controller.go:356] Processing sync for service openshift-authentication/oauth-openshift for network=default\\\\nI1007 13:01:22.285335 6356 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:22.285366 6356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1007 
13:01:22.285372 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI1007 13:01:22.285397 6356 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 1.412819ms\\\\nI1007 13:01:22.285431 6356 services_controller.go:356] Processing sync for service openshift-image-registry/image-registry for network=default\\\\nI1007 13:01:22.285429 6356 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nF1007 13:01:22.285446 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\
",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\"
:\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 
13:01:40.518390 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.518444 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.518461 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.518919 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.519292 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.533350 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb
288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.538240 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.543213 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.543248 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.543261 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.543282 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.543298 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.549359 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.557378 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: E1007 13:01:40.557563 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.560122 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.560175 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.560187 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.560207 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.560221 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.569407 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.581374 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.592607 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:40Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.663287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.663490 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.663526 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.663560 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.663586 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.766534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.766608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.766661 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.766688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.766702 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.869014 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.869080 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.869097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.869124 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.869145 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.972080 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.972144 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.972162 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.972191 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:40 crc kubenswrapper[4959]: I1007 13:01:40.972209 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:40Z","lastTransitionTime":"2025-10-07T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.075596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.075670 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.075681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.075700 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.075710 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.178562 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.178597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.178607 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.178620 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.178654 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.281090 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.281126 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.281135 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.281153 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.281165 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.310351 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/2.log" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.312907 4959 scope.go:117] "RemoveContainer" containerID="134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.313060 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.343062 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.360195 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.379144 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.383187 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.383465 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.383489 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc 
kubenswrapper[4959]: I1007 13:01:41.383521 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.383547 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.398470 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.418390 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.439176 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.451384 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.465564 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.479105 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.486731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 
13:01:41.486786 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.486804 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.486833 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.486851 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.495151 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.512535 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc 
kubenswrapper[4959]: I1007 13:01:41.524707 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.538508 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\"
,\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.556911 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.574419 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.590591 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.590693 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.590712 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.590740 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.590758 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.593249 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:41Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.672090 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.672435 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:02:13.672381535 +0000 UTC m=+85.833104252 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.672610 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.672744 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.672853 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.672906 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.672986 4959 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:02:13.672951441 +0000 UTC m=+85.833674168 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.673030 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:02:13.673011752 +0000 UTC m=+85.833734689 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.700318 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.700391 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.700411 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.700501 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.700532 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.774229 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.774325 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774519 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774566 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774580 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774679 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:02:13.774655258 +0000 UTC m=+85.935377935 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774519 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774787 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774821 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.774934 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:02:13.774900435 +0000 UTC m=+85.935623152 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.804010 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.804068 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.804081 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.804099 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.804113 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.808303 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.808398 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.808453 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.808410 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.808555 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.808606 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.808821 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:41 crc kubenswrapper[4959]: E1007 13:01:41.808947 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.907811 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.907872 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.907891 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.907915 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:41 crc kubenswrapper[4959]: I1007 13:01:41.907933 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:41Z","lastTransitionTime":"2025-10-07T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.012053 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.012086 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.012095 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.012111 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.012121 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.026166 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.047072 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.064612 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.088422 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.115749 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.115797 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.115807 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.115828 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.115841 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.133627 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.157911 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.184146 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.195465 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.204738 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.218892 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.218952 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.218966 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.218989 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.219007 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.218941 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.234601 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.249464 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc 
kubenswrapper[4959]: I1007 13:01:42.260791 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.274119 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.289339 4959 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.307200 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.322026 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.322066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.322076 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.322094 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.322106 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.322580 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.339428 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T
13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.425098 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.425618 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.425635 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.425679 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.425700 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.529591 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.529719 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.529744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.529776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.529796 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.633278 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.633361 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.633378 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.633412 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.633431 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.736809 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.736880 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.736904 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.736937 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.736962 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.841235 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.841311 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.841331 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.841361 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.841383 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.945067 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.945128 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.945140 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.945162 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:42 crc kubenswrapper[4959]: I1007 13:01:42.945178 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:42Z","lastTransitionTime":"2025-10-07T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.053230 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.053300 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.053315 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.053343 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.053367 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.160484 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.160769 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.160839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.160911 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.160974 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.263967 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.264271 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.264351 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.264424 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.264542 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.366733 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.366794 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.366811 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.366833 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.366849 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.470600 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.470702 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.470724 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.470750 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.470772 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.574485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.575180 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.575344 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.575507 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.575702 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.679281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.679367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.679390 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.679420 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.679443 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.782782 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.783240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.783386 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.783522 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.783690 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.808220 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.808240 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.808377 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:01:43 crc kubenswrapper[4959]: E1007 13:01:43.808536 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:01:43 crc kubenswrapper[4959]: E1007 13:01:43.808746 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:01:43 crc kubenswrapper[4959]: E1007 13:01:43.808970 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.809380 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:01:43 crc kubenswrapper[4959]: E1007 13:01:43.809736 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.886681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.886757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.886783 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.886821 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.886847 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.991096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.991186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.991211 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.991247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:43 crc kubenswrapper[4959]: I1007 13:01:43.991270 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:43Z","lastTransitionTime":"2025-10-07T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.095132 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.095180 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.095193 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.095216 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.095231 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.198847 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.198930 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.198949 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.198981 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.199015 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.303003 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.303079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.303141 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.303173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.303196 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.407231 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.407316 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.407334 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.407364 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.407387 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.510972 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.511051 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.511077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.511109 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.511133 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.615546 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.615678 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.615718 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.615759 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.615792 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.720056 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.720131 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.720153 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.720186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.720207 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.823492 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.823561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.823583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.823613 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.823694 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.929437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.929485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.929501 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.929525 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:44 crc kubenswrapper[4959]: I1007 13:01:44.929539 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:44Z","lastTransitionTime":"2025-10-07T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.033190 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.033372 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.033400 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.033435 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.033457 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.136983 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.137048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.137064 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.137096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.137114 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.241225 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.241287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.241301 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.241322 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.241335 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.346503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.346568 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.346583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.347295 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.347313 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.450994 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.451062 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.451076 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.451099 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.451117 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.554859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.554940 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.554962 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.554992 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.555013 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.658210 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.658291 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.658310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.658339 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.658362 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.761233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.761307 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.761321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.761342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.761359 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.808655 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.808689 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.808767 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.808827 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:01:45 crc kubenswrapper[4959]: E1007 13:01:45.808984 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:01:45 crc kubenswrapper[4959]: E1007 13:01:45.809163 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:01:45 crc kubenswrapper[4959]: E1007 13:01:45.809279 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:01:45 crc kubenswrapper[4959]: E1007 13:01:45.809466 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.864165 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.864232 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.864255 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.864284 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.864305 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.967212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.967281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.967299 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.967326 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:45 crc kubenswrapper[4959]: I1007 13:01:45.967348 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:45Z","lastTransitionTime":"2025-10-07T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.070917 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.070978 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.070996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.071022 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.071042 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.174918 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.174971 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.174988 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.175013 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.175032 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.278743 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.278833 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.278860 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.278901 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.278929 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.383895 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.383995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.384021 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.384057 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.384082 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.487520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.487799 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.487839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.487874 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.487897 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.591194 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.591240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.591249 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.591269 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.591283 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.695191 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.695243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.695252 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.695269 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.695279 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.798399 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.798464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.798476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.798493 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.798509 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.902024 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.902108 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.902137 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.902169 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:46 crc kubenswrapper[4959]: I1007 13:01:46.902193 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:46Z","lastTransitionTime":"2025-10-07T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.005719 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.005776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.005794 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.005819 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.005839 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.109577 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.109669 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.109680 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.109699 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.109711 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.212963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.213028 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.213046 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.213072 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.213096 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.315967 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.316037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.316053 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.316079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.316104 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.420318 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.420379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.420403 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.420676 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.420710 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.524239 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.524315 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.524341 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.524373 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.524396 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.627569 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.627620 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.627657 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.627677 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.627689 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.731584 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.731686 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.731706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.731728 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.731742 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.807839 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:47 crc kubenswrapper[4959]: E1007 13:01:47.808047 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.808133 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.808261 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.808309 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:47 crc kubenswrapper[4959]: E1007 13:01:47.808548 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:47 crc kubenswrapper[4959]: E1007 13:01:47.808712 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:47 crc kubenswrapper[4959]: E1007 13:01:47.808839 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.836139 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.836204 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.836228 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.836253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.836269 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.943605 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.943800 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.943857 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.943923 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:47 crc kubenswrapper[4959]: I1007 13:01:47.943958 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:47Z","lastTransitionTime":"2025-10-07T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.047664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.047718 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.047733 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.047751 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.047763 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.150041 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.150129 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.150141 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.150163 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.150176 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.254167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.254242 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.254255 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.254280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.254296 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.356700 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.356747 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.356759 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.356778 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.356791 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.459575 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.459647 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.459663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.459681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.459698 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.562788 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.562862 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.562875 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.562901 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.562917 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.667294 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.667357 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.667375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.667402 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.667422 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.769739 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.769804 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.769822 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.769851 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.769874 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.831319 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.846421 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.873164 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.873568 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.873686 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.873798 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.873867 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.884762 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.906720 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.928297 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.943025 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.954726 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.968367 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc 
kubenswrapper[4959]: I1007 13:01:48.976437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.976483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.976495 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.976515 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.976526 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:48Z","lastTransitionTime":"2025-10-07T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.981576 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:48 crc kubenswrapper[4959]: I1007 13:01:48.994338 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.009498 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.026426 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.043380 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.056503 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.070385 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting 
controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.078592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.078697 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc 
kubenswrapper[4959]: I1007 13:01:49.078723 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.078756 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.078782 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.084904 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.104482 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.182192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.182246 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.182255 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.182324 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.182339 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.286079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.286134 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.286148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.286169 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.286185 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.389050 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.389096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.389108 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.389131 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.389145 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.493754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.493819 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.493840 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.493931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.493950 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.597303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.597350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.597363 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.597382 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.597396 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.700037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.700100 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.700114 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.700138 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.700160 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.803416 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.803490 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.803509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.803537 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.803557 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.809363 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.809391 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.809403 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.809531 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:49 crc kubenswrapper[4959]: E1007 13:01:49.809539 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:49 crc kubenswrapper[4959]: E1007 13:01:49.809670 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:49 crc kubenswrapper[4959]: E1007 13:01:49.809748 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:49 crc kubenswrapper[4959]: E1007 13:01:49.809801 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.907249 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.907593 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.907716 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.907828 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:49 crc kubenswrapper[4959]: I1007 13:01:49.907968 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:49Z","lastTransitionTime":"2025-10-07T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.010949 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.010980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.010989 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.011002 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.011011 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.113455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.113513 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.113525 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.113538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.113547 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.215813 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.216476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.216663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.216777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.216958 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.320582 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.320900 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.320963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.321082 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.321159 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.423744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.423793 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.423803 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.423818 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.423830 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.526056 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.526109 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.526128 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.526158 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.526177 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.628446 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.628481 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.628490 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.628504 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.628513 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.679159 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.679190 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.679198 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.679212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.679221 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: E1007 13:01:50.692031 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.696010 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.696277 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.696341 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.696410 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.696478 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: E1007 13:01:50.709222 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.713056 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.713093 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.713101 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.713115 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.713125 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: E1007 13:01:50.726570 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.730100 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.730167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.730185 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.730210 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.730229 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: E1007 13:01:50.743038 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.746995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.747034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.747045 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.747066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.747077 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: E1007 13:01:50.758850 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:50Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:50 crc kubenswrapper[4959]: E1007 13:01:50.758965 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.760776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.760811 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.760823 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.760841 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.760852 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.874373 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.874412 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.874423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.874441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.874453 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.976917 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.976964 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.976976 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.976993 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:50 crc kubenswrapper[4959]: I1007 13:01:50.977002 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:50Z","lastTransitionTime":"2025-10-07T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.078951 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.079225 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.079337 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.079362 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.079371 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.181251 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.181303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.181318 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.181342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.181359 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.283791 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.283831 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.283840 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.283856 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.283870 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.386000 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.386037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.386048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.386066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.386076 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.488441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.488474 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.488482 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.488496 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.488505 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.591437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.591475 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.591485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.591502 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.591513 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.693593 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.693657 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.693673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.693711 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.693724 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.799923 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.799966 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.799976 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.799994 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.800006 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.807932 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:51 crc kubenswrapper[4959]: E1007 13:01:51.808062 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.808181 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.808262 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:51 crc kubenswrapper[4959]: E1007 13:01:51.808342 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.808531 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:51 crc kubenswrapper[4959]: E1007 13:01:51.808615 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:51 crc kubenswrapper[4959]: E1007 13:01:51.809044 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.902068 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.902107 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.902119 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.902136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:51 crc kubenswrapper[4959]: I1007 13:01:51.902148 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:51Z","lastTransitionTime":"2025-10-07T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.004544 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.004597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.004614 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.004667 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.004688 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.108381 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.108415 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.108425 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.108440 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.108451 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.211343 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.211408 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.211424 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.211448 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.211465 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.314229 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.314287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.314302 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.314387 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.314402 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.418064 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.418106 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.418116 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.418136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.418147 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.521524 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.521565 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.521577 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.521595 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.521609 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.624560 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.624660 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.624678 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.624707 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.624724 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.726466 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.726514 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.726528 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.726544 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.726557 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.829555 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.829607 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.829619 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.829661 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.829675 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.931905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.931941 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.931950 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.931966 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:52 crc kubenswrapper[4959]: I1007 13:01:52.931976 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:52Z","lastTransitionTime":"2025-10-07T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.034487 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.034540 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.034553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.034570 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.034580 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.138124 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.138167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.138178 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.138197 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.138209 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.240894 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.240925 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.240935 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.240948 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.240956 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.343220 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.343266 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.343279 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.343296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.343311 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.446453 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.446492 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.446503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.446519 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.446531 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.549481 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.549517 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.549525 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.549538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.549548 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.651278 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.651316 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.651325 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.651342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.651353 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.754014 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.754061 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.754072 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.754088 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.754099 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.808458 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.808512 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:53 crc kubenswrapper[4959]: E1007 13:01:53.809191 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.808527 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.809265 4959 scope.go:117] "RemoveContainer" containerID="134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629" Oct 07 13:01:53 crc kubenswrapper[4959]: E1007 13:01:53.809316 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:53 crc kubenswrapper[4959]: E1007 13:01:53.808958 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.808535 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:53 crc kubenswrapper[4959]: E1007 13:01:53.809397 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:53 crc kubenswrapper[4959]: E1007 13:01:53.809520 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.856526 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.856838 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.856921 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.857017 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.857097 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.959748 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.959876 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.959890 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.959913 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:53 crc kubenswrapper[4959]: I1007 13:01:53.959930 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:53Z","lastTransitionTime":"2025-10-07T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.062615 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.062677 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.062688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.062711 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.062723 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.165922 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.166172 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.166180 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.166198 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.166208 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.269062 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.269142 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.269165 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.269196 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.269217 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.372039 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.372087 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.372096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.372118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.372128 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.475570 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.475663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.475682 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.475709 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.475728 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.579190 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.579257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.579275 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.579301 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.579321 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.682894 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.682972 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.682995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.683023 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.683045 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.786476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.786577 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.786592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.786612 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.786642 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.888503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.888532 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.888541 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.888554 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.888562 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.992589 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.992666 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.992685 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.992710 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:54 crc kubenswrapper[4959]: I1007 13:01:54.992728 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:54Z","lastTransitionTime":"2025-10-07T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.095760 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.095823 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.095846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.095876 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.095900 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.198121 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.198164 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.198177 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.198198 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.198213 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.303386 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.303454 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.303468 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.303487 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.303537 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.405837 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.405872 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.405881 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.405896 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.405905 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.508025 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.508052 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.508061 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.508077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.508085 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.611328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.611382 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.611392 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.611409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.611421 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.714963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.715013 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.715023 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.715040 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.715057 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.808499 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.808537 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:55 crc kubenswrapper[4959]: E1007 13:01:55.808687 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:55 crc kubenswrapper[4959]: E1007 13:01:55.808846 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.808905 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.808949 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:55 crc kubenswrapper[4959]: E1007 13:01:55.809013 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:55 crc kubenswrapper[4959]: E1007 13:01:55.809513 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.817177 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.817214 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.817228 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.817243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.817255 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.919799 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.919846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.919859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.919876 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:55 crc kubenswrapper[4959]: I1007 13:01:55.919885 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:55Z","lastTransitionTime":"2025-10-07T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.022325 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.022369 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.022381 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.022397 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.022407 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.125086 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.125139 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.125154 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.125174 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.125188 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.228129 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.228200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.228210 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.228227 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.228243 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.330326 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.330367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.330379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.330394 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.330402 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.337153 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:56 crc kubenswrapper[4959]: E1007 13:01:56.337324 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:56 crc kubenswrapper[4959]: E1007 13:01:56.337385 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:02:28.337370583 +0000 UTC m=+100.498093260 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.433004 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.433053 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.433068 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.433106 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.433117 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.535999 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.536047 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.536060 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.536082 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.536095 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.638662 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.638709 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.638723 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.638741 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.638754 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.741392 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.741445 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.741455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.741473 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.741487 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.845192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.845249 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.845261 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.845287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.845302 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.948458 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.948518 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.948532 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.948553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:56 crc kubenswrapper[4959]: I1007 13:01:56.948565 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:56Z","lastTransitionTime":"2025-10-07T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.051603 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.051673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.051688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.051706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.051719 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.162476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.162546 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.162558 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.162581 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.162592 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.265672 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.265725 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.265736 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.265756 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.265770 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.369101 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.369156 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.369171 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.369194 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.369238 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.472209 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.472259 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.472269 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.472287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.472298 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.575918 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.575962 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.575972 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.575992 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.576003 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.679993 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.680059 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.680074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.680097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.680112 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.783482 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.783546 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.783556 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.783574 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.783583 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.808061 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:57 crc kubenswrapper[4959]: E1007 13:01:57.808532 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.808163 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.808131 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:57 crc kubenswrapper[4959]: E1007 13:01:57.808610 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.808290 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:57 crc kubenswrapper[4959]: E1007 13:01:57.808984 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:57 crc kubenswrapper[4959]: E1007 13:01:57.809040 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.887317 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.887374 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.887383 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.887401 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.887413 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.990756 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.990818 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.990832 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.990853 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:57 crc kubenswrapper[4959]: I1007 13:01:57.990864 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:57Z","lastTransitionTime":"2025-10-07T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.094263 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.094314 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.094328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.094345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.094355 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.196910 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.196956 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.196964 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.196980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.196990 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.299833 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.299873 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.299882 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.299900 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.299910 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.378298 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/0.log" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.378348 4959 generic.go:334] "Generic (PLEG): container finished" podID="07e132b2-5c1c-488e-abf4-bdaf3fcf4f93" containerID="db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862" exitCode=1 Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.378380 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerDied","Data":"db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.378775 4959 scope.go:117] "RemoveContainer" containerID="db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.393296 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.402088 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.402118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.402130 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.402144 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.402154 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.409097 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.425147 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.442235 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd
2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.457401 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\"
,\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.472343 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.486223 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.500447 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.504259 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.504318 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.504329 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.504348 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.504360 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.518671 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.540647 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.558995 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.572284 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.588559 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.605143 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.606799 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.606838 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.606847 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.606865 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.606876 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.617950 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc 
kubenswrapper[4959]: I1007 13:01:58.630872 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.647026 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.709568 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.709608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.709619 4959 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.709651 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.709664 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.812388 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.812427 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.812441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.812459 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.812471 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.823431 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.837381 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.860870 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.873302 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.889066 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.902134 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.913838 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.914939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.915018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.915043 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.915073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.915129 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:58Z","lastTransitionTime":"2025-10-07T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.926035 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc 
kubenswrapper[4959]: I1007 13:01:58.937254 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.950972 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.964865 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.975137 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:58 crc kubenswrapper[4959]: I1007 13:01:58.988260 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.000651 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.014239 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting 
controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.017319 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.017351 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc 
kubenswrapper[4959]: I1007 13:01:59.017360 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.017379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.017392 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.024833 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.035146 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.119508 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.119552 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.119561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.119577 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.119589 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.221326 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.221369 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.221377 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.221390 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.221399 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.323227 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.323281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.323293 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.323310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.323322 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.383547 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/0.log" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.383855 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerStarted","Data":"6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.396492 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.409934 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.423342 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.425744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.425789 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.425798 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.425817 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.425826 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.439913 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc 
kubenswrapper[4959]: I1007 13:01:59.455691 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda1706264796
75932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.471935 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.485911 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.503599 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.519848 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.529069 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.529126 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.529169 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.529195 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.529209 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.533090 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.551404 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.563183 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.577442 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.592751 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.614134 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.639034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.639082 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.639096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.639114 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.639129 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.667538 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.685988 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.742205 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.742244 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.742253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.742267 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.742280 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.808493 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.808601 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:01:59 crc kubenswrapper[4959]: E1007 13:01:59.808661 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.808777 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.808855 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:01:59 crc kubenswrapper[4959]: E1007 13:01:59.808797 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:01:59 crc kubenswrapper[4959]: E1007 13:01:59.809154 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:01:59 crc kubenswrapper[4959]: E1007 13:01:59.809046 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.844907 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.844961 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.844972 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.844996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.845008 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.947814 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.947884 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.947903 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.947932 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:01:59 crc kubenswrapper[4959]: I1007 13:01:59.947949 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:01:59Z","lastTransitionTime":"2025-10-07T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.050663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.050728 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.050744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.050767 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.050781 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.153552 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.153613 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.153650 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.153672 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.153684 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.256975 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.257044 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.257057 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.257073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.257082 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.364802 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.364845 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.364855 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.364872 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.364884 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.469223 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.469269 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.469287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.469311 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.469327 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.572414 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.572484 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.572505 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.572539 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.572563 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.675463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.675501 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.675512 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.675534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.675543 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.778214 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.778260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.778270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.778286 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.778296 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.880952 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.880993 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.881003 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.881018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.881030 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.983969 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.984050 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.984065 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.984087 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:00 crc kubenswrapper[4959]: I1007 13:02:00.984100 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:00Z","lastTransitionTime":"2025-10-07T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.087192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.087409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.087422 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.087441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.087454 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.103221 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.108757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.108853 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.108879 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.108915 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.108948 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.127991 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.133310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.133353 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.133364 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.133379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.133389 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.149884 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.154435 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.154504 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.154532 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.154565 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.154590 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.167742 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.171731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.171769 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.171785 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.171801 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.171813 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.184711 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.184845 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.186739 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.186780 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.186790 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.186804 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.186814 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.289344 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.289389 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.289401 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.289417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.289428 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.392372 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.392416 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.392430 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.392449 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.392462 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.494725 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.494802 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.494826 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.494863 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.494889 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.597413 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.597479 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.597496 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.597516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.597529 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.700176 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.700250 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.700263 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.700282 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.700294 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.803171 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.803222 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.803233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.803256 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.803274 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.808495 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.808529 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.808529 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.808587 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.808702 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.808803 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.808835 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:01 crc kubenswrapper[4959]: E1007 13:02:01.809000 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.906586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.906663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.906735 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.906760 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:01 crc kubenswrapper[4959]: I1007 13:02:01.906770 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:01Z","lastTransitionTime":"2025-10-07T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.009216 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.009247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.009257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.009272 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.009282 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.112342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.112395 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.112437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.112457 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.112468 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.215785 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.215835 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.215846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.215863 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.215872 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.318892 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.318942 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.318952 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.318970 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.318980 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.421984 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.422023 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.422037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.422058 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.422072 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.525095 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.525148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.525164 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.525186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.525199 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.628748 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.628796 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.628812 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.628837 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.628849 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.731866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.731912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.731921 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.731937 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.731949 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.834786 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.834832 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.834842 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.834859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.834871 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.938994 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.939048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.939062 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.939090 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:02 crc kubenswrapper[4959]: I1007 13:02:02.939106 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:02Z","lastTransitionTime":"2025-10-07T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.042430 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.042500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.042514 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.042533 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.042547 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.145523 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.145597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.145610 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.145646 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.145664 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.248797 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.248860 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.248872 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.248890 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.248902 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.351538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.351596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.351608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.351647 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.351663 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.454295 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.454329 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.454340 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.454356 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.454365 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.557074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.557189 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.557216 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.557256 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.557281 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.660483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.660532 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.660541 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.660557 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.660567 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.762928 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.762963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.762972 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.762989 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.762998 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.808227 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.808266 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.808353 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:03 crc kubenswrapper[4959]: E1007 13:02:03.808396 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.808347 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:03 crc kubenswrapper[4959]: E1007 13:02:03.808509 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:03 crc kubenswrapper[4959]: E1007 13:02:03.808754 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:03 crc kubenswrapper[4959]: E1007 13:02:03.808818 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.865956 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.866024 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.866034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.866051 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.866061 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.968095 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.968133 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.968142 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.968159 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:03 crc kubenswrapper[4959]: I1007 13:02:03.968169 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:03Z","lastTransitionTime":"2025-10-07T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.071290 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.071327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.071342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.071359 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.071369 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.174420 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.174475 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.174485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.174500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.174510 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.277210 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.277271 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.277281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.277301 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.277312 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.380271 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.380313 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.380324 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.380340 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.380352 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.483396 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.483438 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.483447 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.483462 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.483474 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.585449 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.585491 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.585500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.585515 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.585525 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.688587 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.688714 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.688745 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.688762 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.688772 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.791010 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.791071 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.791082 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.791098 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.791112 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.894706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.894753 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.894764 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.894781 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.894792 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.997754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.997799 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.997808 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.997823 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:04 crc kubenswrapper[4959]: I1007 13:02:04.997833 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:04Z","lastTransitionTime":"2025-10-07T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.100688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.100732 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.100741 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.100757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.100768 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.203469 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.203530 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.203542 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.203579 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.203595 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.306671 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.306735 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.306754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.306777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.306792 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.409447 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.409499 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.409510 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.409533 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.409546 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.514365 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.514431 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.514455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.514490 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.514513 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.618398 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.618463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.618484 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.618510 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.618531 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.721856 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.721930 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.721945 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.721965 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.721978 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.808725 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.808839 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.808879 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.808837 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:05 crc kubenswrapper[4959]: E1007 13:02:05.809000 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:05 crc kubenswrapper[4959]: E1007 13:02:05.809078 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:05 crc kubenswrapper[4959]: E1007 13:02:05.809176 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:05 crc kubenswrapper[4959]: E1007 13:02:05.809326 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.825657 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.825717 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.825737 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.825762 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.825781 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.929520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.929586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.929603 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.929666 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:05 crc kubenswrapper[4959]: I1007 13:02:05.929707 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:05Z","lastTransitionTime":"2025-10-07T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.033109 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.033156 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.033169 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.033188 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.033203 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.135926 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.135995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.136007 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.136030 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.136048 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.239154 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.239202 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.239212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.239250 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.239265 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.341831 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.341905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.341926 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.341955 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.341982 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.446282 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.446345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.446368 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.446394 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.446414 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.552441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.552533 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.552556 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.552586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.552608 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.655936 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.655986 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.655999 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.656021 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.656036 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.758360 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.758407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.758422 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.758441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.758453 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.861018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.861068 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.861080 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.861097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.861107 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.963320 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.963382 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.963394 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.963413 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:06 crc kubenswrapper[4959]: I1007 13:02:06.963424 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:06Z","lastTransitionTime":"2025-10-07T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.066310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.066447 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.066457 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.066471 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.066481 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.170036 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.170084 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.170100 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.170122 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.170134 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.273177 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.273260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.273281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.273310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.273330 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.376337 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.376408 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.376426 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.376456 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.376477 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.479502 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.479575 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.479594 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.479620 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.479656 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.583576 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.583638 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.583654 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.583675 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.583687 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.686449 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.686509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.686522 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.686542 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.686555 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.789160 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.789200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.789209 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.789226 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.789238 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.808740 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.808822 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:07 crc kubenswrapper[4959]: E1007 13:02:07.808874 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.808906 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.808740 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:07 crc kubenswrapper[4959]: E1007 13:02:07.809050 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:07 crc kubenswrapper[4959]: E1007 13:02:07.809118 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:07 crc kubenswrapper[4959]: E1007 13:02:07.809220 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.809767 4959 scope.go:117] "RemoveContainer" containerID="134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.891018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.891168 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.891184 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.891230 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.891242 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.993995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.994038 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.994048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.994064 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:07 crc kubenswrapper[4959]: I1007 13:02:07.994078 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:07Z","lastTransitionTime":"2025-10-07T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.096946 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.097004 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.097017 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.097037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.097050 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.199327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.199377 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.199394 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.199417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.199432 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.301768 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.301814 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.301826 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.301847 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.301860 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.404600 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.404666 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.404684 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.404706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.404721 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.419600 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/2.log" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.421868 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.422221 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.435371 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8
168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.450355 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.463648 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.478412 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.497140 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.506929 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.506984 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.506999 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.507016 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.507028 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.514442 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.530608 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T
13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.549611 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.562103 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.572248 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.586270 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.604040 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.609354 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.609399 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.609412 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.609429 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.609443 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.617336 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.628232 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.649074 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.662924 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.674397 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.717409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.717479 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.717495 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.717516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.717531 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.819816 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.819889 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.819910 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.819939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.819960 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.827112 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 
13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.842281 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.862501 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.876837 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.891468 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.907013 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.922538 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.923377 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.923416 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.923429 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.923452 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.923466 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:08Z","lastTransitionTime":"2025-10-07T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.936279 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.951732 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.967497 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.981315 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:08 crc kubenswrapper[4959]: I1007 13:02:08.993368 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.007382 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\
\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.019878 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.025807 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.025850 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.025861 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.025883 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.025897 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.033121 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.045279 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.059699 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.128599 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.128931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.129048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.129161 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.129244 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.230976 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.231010 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.231019 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.231033 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.231043 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.333040 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.333078 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.333088 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.333112 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.333127 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.426319 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/3.log" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.426926 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/2.log" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.429489 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" exitCode=1 Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.429526 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.429567 4959 scope.go:117] "RemoveContainer" containerID="134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.430401 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:02:09 crc kubenswrapper[4959]: E1007 13:02:09.430666 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.441749 4959 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.442028 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.442285 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.442513 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.443195 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.450603 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] 
\\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.459297 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.469678 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.480512 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.494861 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.504996 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.517416 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.528983 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.539711 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.545427 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.545456 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.545468 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 
13:02:09.545486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.545532 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.553003 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.574346 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://134c17d76982dbda3e7392ad3149f6e7bfffe7940cb889f7a9f6776cf9c90629\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:39Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674151 6590 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 13:01:39.674302 6590 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674521 6590 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.674677 6590 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 13:01:39.675201 6590 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:01:39.675275 6590 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:01:39.675328 6590 factory.go:656] Stopping watch factory\\\\nI1007 13:01:39.675348 6590 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:01:39.675393 6590 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:01:39.675408 6590 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:01:39.675421 6590 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:01:39.675513 6590 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:02:08Z\\\",\\\"message\\\":\\\"] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z]\\\\nI1007 13:02:08.643279 6949 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419
b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.588860 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.601309 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.613874 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.626198 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.640714 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.648200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.648273 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.648288 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.648310 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.648326 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.654445 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:09 crc 
kubenswrapper[4959]: I1007 13:02:09.751346 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.751777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.751892 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.752002 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.752097 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.808226 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.808331 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.808381 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.808411 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:09 crc kubenswrapper[4959]: E1007 13:02:09.808965 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:09 crc kubenswrapper[4959]: E1007 13:02:09.809179 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:09 crc kubenswrapper[4959]: E1007 13:02:09.809228 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:09 crc kubenswrapper[4959]: E1007 13:02:09.809292 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.854283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.854329 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.854338 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.854355 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.854365 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.956808 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.957134 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.957256 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.957350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:09 crc kubenswrapper[4959]: I1007 13:02:09.957438 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:09Z","lastTransitionTime":"2025-10-07T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.060161 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.060212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.060226 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.060241 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.060254 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.164406 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.164456 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.164465 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.164483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.164495 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.266536 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.266585 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.266596 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.266611 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.266621 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.369928 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.370006 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.370030 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.370061 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.370085 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.436385 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/3.log" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.443392 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:02:10 crc kubenswrapper[4959]: E1007 13:02:10.443801 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.459359 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.473852 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.473905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.473918 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.473940 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.473956 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.479963 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.505152 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.529030 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.547603 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.575727 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354e
a61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.576857 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.576909 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.576924 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.576951 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.576968 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.593164 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.618022 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T
13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.641857 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.664585 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.679755 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.680086 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.680196 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.680315 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.680417 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.688493 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.710012 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.733020 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:02:08Z\\\",\\\"message\\\":\\\"] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z]\\\\nI1007 13:02:08.643279 6949 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.751363 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.770272 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.782291 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.784683 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.784840 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.784881 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc 
kubenswrapper[4959]: I1007 13:02:10.784924 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.784955 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.792966 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.821577 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.889398 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.889475 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.889500 4959 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.889534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.889560 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.994816 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.994914 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.994940 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.994980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:10 crc kubenswrapper[4959]: I1007 13:02:10.995007 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:10Z","lastTransitionTime":"2025-10-07T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.098055 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.098120 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.098133 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.098154 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.098168 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.200747 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.200792 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.200804 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.200823 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.200835 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.303654 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.303696 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.303705 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.303721 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.303734 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.407476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.407536 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.407549 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.407574 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.407597 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.510839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.510893 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.510906 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.510925 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.510958 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.565136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.565233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.565258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.565296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.565323 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.581096 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.585419 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.585456 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.585466 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.585483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.585499 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.602338 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.607389 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.607446 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.607475 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.607509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.607735 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.625289 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.629869 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.629920 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.629933 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.629957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.629973 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.655200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.655274 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.655298 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.655332 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.655357 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.675556 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.675693 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.677146 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.677183 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.677196 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.677219 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.677232 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.779545 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.779599 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.779614 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.779657 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.779676 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.808701 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.808744 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.808709 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.808822 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.808701 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.808906 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.809002 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:11 crc kubenswrapper[4959]: E1007 13:02:11.809081 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.882096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.882141 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.882155 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.882175 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.882188 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.987383 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.987453 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.987467 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.987486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:11 crc kubenswrapper[4959]: I1007 13:02:11.987503 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:11Z","lastTransitionTime":"2025-10-07T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.090944 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.091002 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.091021 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.091041 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.091052 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.193973 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.194067 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.194087 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.194114 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.194165 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.297906 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.298274 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.298376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.298486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.298588 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.402767 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.402853 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.402879 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.402912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.402934 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.507371 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.507431 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.507448 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.507471 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.507487 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.611731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.611820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.611846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.611879 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.611900 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.716072 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.716148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.716165 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.716192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.716213 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.819076 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.819154 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.819173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.819205 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.819225 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.922697 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.922773 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.922787 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.922814 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:12 crc kubenswrapper[4959]: I1007 13:02:12.922827 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:12Z","lastTransitionTime":"2025-10-07T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.027048 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.027126 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.027152 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.027181 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.027201 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.130459 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.130546 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.130572 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.130608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.130682 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.234226 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.234283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.234293 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.234317 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.234330 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.336687 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.336736 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.336747 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.336764 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.336774 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.439423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.439545 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.439580 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.439619 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.439682 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.543443 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.543520 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.543538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.543570 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.543590 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.647497 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.647551 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.647563 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.647584 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.647599 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.736322 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.736694 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 13:03:17.736603215 +0000 UTC m=+149.897325932 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.736867 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.736926 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.737090 4959 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.737183 4959 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.737202 4959 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.737171611 +0000 UTC m=+149.897894318 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.737335 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.737309645 +0000 UTC m=+149.898032352 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.750917 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.751000 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.751025 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.751058 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.751083 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.808580 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.808758 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.808865 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.808608 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.809015 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.808907 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.809763 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.809969 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.827664 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.838234 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.838336 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838492 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838542 4959 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838557 4959 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838585 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838614 4959 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838658 4959 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838656 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.838612364 +0000 UTC m=+149.999335281 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:02:13 crc kubenswrapper[4959]: E1007 13:02:13.838731 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.838707266 +0000 UTC m=+149.999430173 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.854354 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.854400 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.854409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.854429 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.854442 4959 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.958304 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.958399 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.958473 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.958508 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:13 crc kubenswrapper[4959]: I1007 13:02:13.958537 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:13Z","lastTransitionTime":"2025-10-07T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.062466 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.062526 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.062536 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.062556 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.062568 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.166011 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.166076 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.166089 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.166112 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.166126 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.270120 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.270196 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.270214 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.270293 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.270327 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.379863 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.380366 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.380428 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.380582 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.380995 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.488349 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.488417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.488435 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.488462 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.488484 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.591950 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.592068 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.592137 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.592173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.592201 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.695912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.695987 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.696007 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.696037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.696061 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.800173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.800227 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.800257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.800276 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.800286 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.904224 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.904293 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.904312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.904341 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:14 crc kubenswrapper[4959]: I1007 13:02:14.904360 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:14Z","lastTransitionTime":"2025-10-07T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.007985 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.008066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.008085 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.008112 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.008131 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.110714 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.110765 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.110777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.110797 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.110810 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.213943 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.214002 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.214013 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.214031 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.214044 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.316524 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.316590 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.316609 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.316659 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.316675 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.420243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.420349 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.420379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.420417 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.420447 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.523754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.523848 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.523869 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.523899 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.523920 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.628096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.628198 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.628236 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.628270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.628295 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.731611 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.731695 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.731712 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.731740 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.731757 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.807912 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.808011 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.807925 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.807925 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:15 crc kubenswrapper[4959]: E1007 13:02:15.808135 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:15 crc kubenswrapper[4959]: E1007 13:02:15.808281 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:15 crc kubenswrapper[4959]: E1007 13:02:15.808451 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:15 crc kubenswrapper[4959]: E1007 13:02:15.808582 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.835540 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.835616 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.835655 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.835682 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.835700 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.938553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.938705 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.938731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.938765 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:15 crc kubenswrapper[4959]: I1007 13:02:15.938791 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:15Z","lastTransitionTime":"2025-10-07T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.042599 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.042746 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.042775 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.042803 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.042830 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.146092 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.146164 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.146186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.146213 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.146233 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.249247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.249304 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.249314 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.249333 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.249348 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.352384 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.352470 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.352489 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.352518 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.352539 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.455339 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.455407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.455419 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.455440 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.455458 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.558024 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.558095 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.558110 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.558129 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.558140 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.661191 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.661288 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.661316 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.661365 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.661394 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.765629 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.765740 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.765763 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.765798 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.765833 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.874312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.874407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.874438 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.874509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.874548 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.979027 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.979067 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.979077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.979093 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:16 crc kubenswrapper[4959]: I1007 13:02:16.979107 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:16Z","lastTransitionTime":"2025-10-07T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.081842 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.081885 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.081894 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.081907 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.081917 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.185787 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.185866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.185885 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.185916 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.185939 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.288878 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.288936 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.288950 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.288971 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.288986 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.391081 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.391114 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.391123 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.391136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.391146 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.493233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.493277 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.493286 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.493302 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.493313 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.595902 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.595954 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.595968 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.595987 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.596000 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.698964 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.699004 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.699015 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.699037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.699123 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.801447 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.801499 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.801516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.801542 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.801561 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.808541 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.808584 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:17 crc kubenswrapper[4959]: E1007 13:02:17.808718 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.808761 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.808839 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:17 crc kubenswrapper[4959]: E1007 13:02:17.808926 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:17 crc kubenswrapper[4959]: E1007 13:02:17.809099 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:17 crc kubenswrapper[4959]: E1007 13:02:17.809255 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.905015 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.905087 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.905108 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.905136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:17 crc kubenswrapper[4959]: I1007 13:02:17.905156 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:17Z","lastTransitionTime":"2025-10-07T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.009171 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.009253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.009272 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.009303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.009325 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.125306 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.125343 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.125375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.125388 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.125397 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.227726 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.227779 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.227792 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.227815 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.227828 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.330372 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.330450 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.330463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.330504 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.330520 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.434780 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.434849 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.434866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.434894 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.434912 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.537597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.537716 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.537736 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.537770 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.537791 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.641560 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.641710 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.641779 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.641809 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.641834 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.745375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.745453 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.745478 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.745518 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.745556 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.833901 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.849063 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.849137 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.849154 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.849182 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.849207 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.859930 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.879875 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.905502 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.922429 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.935195 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.951931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.951983 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.951994 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.952011 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.952024 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:18Z","lastTransitionTime":"2025-10-07T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.953115 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.975225 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:18 crc kubenswrapper[4959]: I1007 13:02:18.997335 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:02:08Z\\\",\\\"message\\\":\\\"] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z]\\\\nI1007 13:02:08.643279 6949 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.011603 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"899834d0-f245-4816-b2c5-b53e8aa64b2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a4d10fee883718bf74d987b3c350647a4beb21675a8f68736e0921528fb11e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9936111edf568f371613694e67f670972acf94dea105b3b4c85be8a4f9bb22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9936111edf568f371613694e67f670972acf94dea105b3b4c85be8a4f9bb22d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.023747 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.035798 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.056805 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.056850 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.056863 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc 
kubenswrapper[4959]: I1007 13:02:19.056882 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.056894 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.059649 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2708ca95-ea89-4a18-a82c-acac93036574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1eb7fe8a7e924894ea24b92a12e9e43c9d46ed10e8d9155e9bc8eab59a87104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://312125d4d0c09a6a4ab482468a25a2450713e015859407828cd8a49d544c1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418c7ef477647c1476381ce30eeffbf4a196db028153acc2199b9fb9649d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9909669435c6d1210b5023239566b7675267744eea23011e9c4236c72a247a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14abe5ee99dd207caf5bd51d812cf73517636174bab67fd0cda461ad94be1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11940cf7b6bb581b5c17a55adcdfc768a6dac7babc1d181ae145fa210b1eb9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11940cf7b6bb581b5c17a55adcdfc768a6dac7babc1d181ae145fa210b1eb9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46527343daf5429f576fa768cceeba35d1f59062e578476d59c877ed93083c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46527343daf5429f576fa768cceeba35d1f59062e578476d59c877ed93083c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d7820a50f5329505372fec026c5a882e1c12407a4d79af5eba3b9ad99fa48c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7820a50f5329505372fec026c5a882e1c12407a4d79af5eba3b9ad99fa48c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.071555 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287
a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.082366 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.092364 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc 
kubenswrapper[4959]: I1007 13:02:19.101773 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.114707 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.134683 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.159830 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.159879 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.159888 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.159905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.159918 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.263213 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.263286 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.263307 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.263335 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.263354 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.367392 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.367468 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.367485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.367515 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.367532 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.470822 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.470916 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.470935 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.470962 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.470984 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.574839 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.574903 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.574913 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.574934 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.574945 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.678097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.678201 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.678221 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.678258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.678280 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.780819 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.780866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.780886 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.780908 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.780921 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.809283 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.809317 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.809298 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.809447 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:19 crc kubenswrapper[4959]: E1007 13:02:19.809732 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:19 crc kubenswrapper[4959]: E1007 13:02:19.809885 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:19 crc kubenswrapper[4959]: E1007 13:02:19.810083 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:19 crc kubenswrapper[4959]: E1007 13:02:19.810174 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.883688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.883795 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.883822 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.883858 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.883883 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.987602 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.987709 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.987734 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.987763 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:19 crc kubenswrapper[4959]: I1007 13:02:19.987785 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:19Z","lastTransitionTime":"2025-10-07T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.090451 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.090560 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.090578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.090605 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.090670 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.194074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.194154 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.194173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.194209 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.194229 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.298181 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.298265 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.298283 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.298312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.298333 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.400566 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.400619 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.400671 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.400693 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.400715 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.502951 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.503023 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.503045 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.503073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.503090 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.611437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.611545 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.611683 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.611754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.611782 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.715241 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.715312 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.715395 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.715447 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.715469 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.818372 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.818436 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.818455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.818482 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.818502 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.922410 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.922469 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.922483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.922504 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:20 crc kubenswrapper[4959]: I1007 13:02:20.922518 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:20Z","lastTransitionTime":"2025-10-07T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.026393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.026463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.026483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.026509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.026530 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.129700 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.129757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.129777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.129803 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.129819 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.232315 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.232355 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.232365 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.232379 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.232388 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.334765 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.334829 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.334873 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.334898 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.334916 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.438194 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.438243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.438255 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.438271 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.438284 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.541158 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.541345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.541357 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.541376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.541387 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.644973 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.645004 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.645014 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.645028 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.645037 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.748318 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.748399 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.748409 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.748429 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.748441 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.808116 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.808161 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.808175 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.808363 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.808522 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.808759 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.808890 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.809047 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.844752 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.844816 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.844834 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.844859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.844879 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.868501 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.876678 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.876766 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.876807 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.876851 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.876878 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.901849 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.907779 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.907842 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.907855 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.907880 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.907895 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.926340 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.932253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.932315 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.932328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.932353 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.932367 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.948908 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.953944 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.954034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.954061 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.954129 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.954151 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.969146 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:21Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:21 crc kubenswrapper[4959]: E1007 13:02:21.969299 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.971150 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.971223 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.971244 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.971273 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:21 crc kubenswrapper[4959]: I1007 13:02:21.971297 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:21Z","lastTransitionTime":"2025-10-07T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.074423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.074494 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.074511 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.074539 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.074570 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.176964 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.177064 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.177084 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.177103 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.177116 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.280281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.280319 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.280327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.280344 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.280355 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.383979 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.384050 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.384067 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.384105 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.384119 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.486777 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.486856 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.486869 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.486887 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.486898 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.589488 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.589850 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.589943 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.590031 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.590300 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.692776 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.693098 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.693173 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.693259 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.693323 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.796457 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.796528 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.796553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.796586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.796608 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.899907 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.900321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.900457 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.900616 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:22 crc kubenswrapper[4959]: I1007 13:02:22.900800 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:22Z","lastTransitionTime":"2025-10-07T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.003649 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.003713 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.003725 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.003745 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.003757 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.105820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.105934 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.105945 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.105965 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.105977 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.210361 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.210419 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.210438 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.210463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.210481 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.313492 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.313524 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.313533 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.313550 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.313559 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.417073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.417116 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.417128 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.417147 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.417156 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.520896 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.520961 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.520984 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.521016 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.521038 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.623328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.623372 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.623382 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.623397 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.623407 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.726388 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.726463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.726481 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.726508 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.726526 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.808586 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.808594 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.808698 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.808724 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:23 crc kubenswrapper[4959]: E1007 13:02:23.808871 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:23 crc kubenswrapper[4959]: E1007 13:02:23.809485 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:23 crc kubenswrapper[4959]: E1007 13:02:23.809593 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:23 crc kubenswrapper[4959]: E1007 13:02:23.809793 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.809984 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:02:23 crc kubenswrapper[4959]: E1007 13:02:23.810318 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.829137 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.829179 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.829194 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.829219 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.829236 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.931914 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.931969 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.931984 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.932011 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:23 crc kubenswrapper[4959]: I1007 13:02:23.932028 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:23Z","lastTransitionTime":"2025-10-07T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.035035 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.035083 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.035094 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.035111 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.035122 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.137772 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.137867 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.137912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.137939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.137959 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.240368 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.240412 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.240427 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.240446 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.240457 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.344343 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.344428 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.344452 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.344485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.344507 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.447403 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.447470 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.447487 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.447513 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.447533 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.551592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.551724 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.551753 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.551797 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.551818 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.655354 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.655534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.655558 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.655586 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.655607 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.759344 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.759440 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.759475 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.759509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.759533 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.862356 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.862444 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.862480 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.862513 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.862535 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.965550 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.965605 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.965622 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.965678 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:24 crc kubenswrapper[4959]: I1007 13:02:24.965697 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:24Z","lastTransitionTime":"2025-10-07T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.069279 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.069332 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.069350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.069376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.069395 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.172456 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.172528 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.172548 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.172575 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.172595 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.275563 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.275659 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.275679 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.275707 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.275726 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.379822 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.379886 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.379910 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.379943 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.379972 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.483837 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.485009 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.485228 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.485454 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.485686 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.588867 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.588937 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.588964 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.589000 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.589033 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.692204 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.692281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.692305 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.692338 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.692358 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.795643 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.795705 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.795716 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.795741 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.795753 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.807911 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.807956 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:25 crc kubenswrapper[4959]: E1007 13:02:25.808191 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:25 crc kubenswrapper[4959]: E1007 13:02:25.808289 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.807978 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:25 crc kubenswrapper[4959]: E1007 13:02:25.808429 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.808969 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:25 crc kubenswrapper[4959]: E1007 13:02:25.809137 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.899260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.899342 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.899366 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.899396 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:25 crc kubenswrapper[4959]: I1007 13:02:25.899419 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:25Z","lastTransitionTime":"2025-10-07T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.002964 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.003018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.003030 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.003049 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.003063 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.106742 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.106808 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.106818 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.106836 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.106846 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.209986 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.210069 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.210098 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.210133 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.210160 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.313358 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.313431 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.313455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.313500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.313526 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.416525 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.416581 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.416601 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.416654 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.416676 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.519904 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.520009 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.520037 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.520066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.520089 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.622981 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.623045 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.623055 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.623077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.623097 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.726931 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.727015 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.727035 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.727069 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.727090 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.829846 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.829917 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.829934 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.829962 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.829982 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.933110 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.933175 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.933192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.933219 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:26 crc kubenswrapper[4959]: I1007 13:02:26.933242 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:26Z","lastTransitionTime":"2025-10-07T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.037144 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.037232 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.037251 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.037280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.037302 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.141459 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.141553 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.141576 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.141603 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.141660 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.244938 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.245020 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.245047 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.245109 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.245130 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.349018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.349076 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.349098 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.349130 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.349156 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.453775 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.453844 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.453862 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.453891 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.453911 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.557731 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.557993 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.558073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.558107 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.558136 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.662663 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.662718 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.662730 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.662746 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.662756 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.768727 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.768788 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.768807 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.768840 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.768866 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.808090 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.808109 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:27 crc kubenswrapper[4959]: E1007 13:02:27.808312 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.808134 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:27 crc kubenswrapper[4959]: E1007 13:02:27.808387 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:27 crc kubenswrapper[4959]: E1007 13:02:27.808456 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.808109 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:27 crc kubenswrapper[4959]: E1007 13:02:27.808766 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.871389 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.871430 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.871441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.871459 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.871472 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.974199 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.974285 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.974296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.974313 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:27 crc kubenswrapper[4959]: I1007 13:02:27.974323 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:27Z","lastTransitionTime":"2025-10-07T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.076026 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.076062 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.076074 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.076092 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.076104 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.177847 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.177873 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.177882 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.177895 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.177904 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.280419 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.280464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.280474 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.280489 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.280499 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.382595 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.382664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.382677 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.382693 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.382706 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.410414 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:28 crc kubenswrapper[4959]: E1007 13:02:28.410603 4959 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:02:28 crc kubenswrapper[4959]: E1007 13:02:28.410693 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs podName:ed03c94e-16fb-42f7-8383-ac7c2c403298 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:32.410672815 +0000 UTC m=+164.571395502 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs") pod "network-metrics-daemon-g57ch" (UID: "ed03c94e-16fb-42f7-8383-ac7c2c403298") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.485380 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.485428 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.485437 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.485454 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.485465 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.587940 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.587981 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.587992 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.588007 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.588018 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.691412 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.691485 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.691509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.691543 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.691565 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.794303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.794354 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.794366 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.794383 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.794394 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.822158 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g57ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed03c94e-16fb-42f7-8383-ac7c2c403298\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pd5bk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g57ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc 
kubenswrapper[4959]: I1007 13:02:28.834227 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ln4wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072d4bea-8eb2-4f2c-b4c9-6c3b43fc68b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d99652a57705c022c6b5f5645fe18185e8168d671b9307a3e23775826ae5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsc6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ln4wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.851987 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.869705 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:01:57Z\\\",\\\"message\\\":\\\"2025-10-07T13:01:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2\\\\n2025-10-07T13:01:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a53aea7-60e1-4f3c-b661-582963a9bbf2 to /host/opt/cni/bin/\\\\n2025-10-07T13:01:12Z [verbose] multus-daemon started\\\\n2025-10-07T13:01:12Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T13:01:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.884143 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0ae775a73ecfd4b5cdfd3d41c86d0a49679e71a6bd0f96a9a4b5b0063fff4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.896380 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.896414 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.896423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.896438 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.896449 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.904854 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w62d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c3422b-6d08-4084-835f-3c6eeb42e474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb60e30fd63639aec13a82851682290d63153cc32f8a1d260ca0ac46134f97eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2894448d0a59f0a53cdd23ecbb30af63fcf029a24e1447711e131d920d8aee5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e22ccae21a17073ca9cf03871aefe690e352e8d223336380287edf18fa66ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85f1ef1246fb1a2170a257a584e3286a9427947e7a8760e0f5b69f10f2f31e59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:12Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1354ea61735d2c3ddf109a2835509c9ced32814687d8f7cf59493f5a2d1377f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb2c84ef023a28f9cd18b84a33058156e14043b7088f00ea429a9a800b255359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15d34983750643746b3e987fdf13ef7087787b026c411b9aaefdde7d582fd803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gll64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w62d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.916552 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"998ce932-909a-4460-868b-149812f4c695\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60901e44033105fc76f08446bd88a0433e2a02bf23e308a10faf6f7ac61524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a8d0e8db88e103c85ff4ceeab416e92a3cd
2a387118143c9710ef6d0fde541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f68f2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4tbb5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.930741 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"130d9f66-f495-4713-b37d-774af84975d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bdb046dab882f905fbe3419aa51802e0e30d759882bca4498d55c844d0f613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cda170626479675932d83e2b40b694cfb9955b22d74ab1bb11c95bdcb73d929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b68d0dd13c0bc62c3aeb01ccd48c163ed4342c35f7a041c23face4f7692cd08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://667830fd4bd0a3bb0c7472a01cbb33ca859f1af7763bb0b525198fe1e621b40c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e4269331e65a67777bdba5b1764accca107297b3ddbd8ab35235f358682a9eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:01:09Z\\\"
,\\\"message\\\":\\\"13:01:09.346709 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 13:01:09.350960 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\"\\\\nF1007 13:01:09.365133 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 13:01:09.367395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-499464514/tls.crt::/tmp/serving-cert-499464514/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759842053\\\\\\\\\\\\\\\" (2025-10-07 13:00:53 +0000 UTC to 2025-11-06 13:00:54 +0000 UTC (now=2025-10-07 13:01:09.350760516 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.367505 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367730 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367785 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1007 13:01:09.367881 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1007 13:01:09.367922 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1007 13:01:09.367954 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759842054\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759842054\\\\\\\\\\\\\\\" (2025-10-07 12:00:54 +0000 UTC to 2026-10-07 12:00:54 +0000 UTC (now=2025-10-07 13:01:09.367846086 +0000 UTC))\\\\\\\"\\\\nI1007 13:01:09.368091 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94070d179e176fe9eb654268f46306bca527d28d220aeb120196a3902254d270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4bf5ad304bd24e845d328e86b856f2a5741b6fa1432bbe3e5d2c80eaf44233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.939887 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7825dd3-7967-4121-b78c-2a365b9c9c1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba0041508702acae4921e20d771ca99952369e982fa4da5f0810f71039796b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cadd6c5fd11259687ba23233a867caf36d441e73bac39a722c7434b5975bfc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f9f004ee564f72af1b21c9eaf8c7258e77d6cd7540dcaa0bf6a45508966985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://23c5d30c44a2cd3b86d074d1679d1dbdae2725c3a418a5ef04c994795a9798b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.951805 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54f1060f5a85ab655cde511bc507a1eb30645560df1c32bd8a907528a315ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c337d2dd6b15c6199a44fdf143610c1e7b891dc240b0224ba208ad00059e6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.967813 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.981311 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029268e61d02dc2eca311e51a90f9b31a87ecebcf68f33a43fbc6f16892a83d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.999072 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.999142 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.999167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.999195 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:28 crc kubenswrapper[4959]: I1007 13:02:28.999215 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:28Z","lastTransitionTime":"2025-10-07T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.009546 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b26fd9a1-4343-4f1c-bef3-764d3c74724a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:02:08Z\\\",\\\"message\\\":\\\"] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:08Z is after 2025-08-24T17:21:41Z]\\\\nI1007 13:02:08.643279 6949 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:02:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c47636502824bce8
df8def75bc76cd71386168791bc66019fea419b803d008\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jfm8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.023538 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"899834d0-f245-4816-b2c5-b53e8aa64b2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a4d10fee883718bf74d987b3c350647a4beb21675a8f68736e0921528fb11e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9936111edf568f371613694e67f670972acf94dea105b3b4c85be8a4f9bb22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9936111edf568f371613694e67f670972acf94dea105b3b4c85be8a4f9bb22d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.036802 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d22d0cc-856e-4729-a352-5d84c739b77b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c721d4b2a18ff12b38e309fc6ffc015a56137e0ae2a55d2fcd5a417013de0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc539549cb288616dbccbfa5315786e6fff526dee8591461c276b3768155534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbba24cb32843150f235450260407da820379f67c01c19d53a83fa38bee2855a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2523651bfc03c6ebf27d0d13c78233b9b3a381778e3a437bab49a6d604c35d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T13:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.050245 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.075093 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2708ca95-ea89-4a18-a82c-acac93036574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1eb7fe8a7e924894ea24b92a12e9e43c9d46ed10e8d9155e9bc8eab59a87104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://312125d4d0c09a6a4ab482468a25a2450713e015859407828cd8a49d544c1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418c7ef477647c1476381ce30eeffbf4a196db028153acc2199b9fb9649d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9909669435c6d1210b5023239566b7675267744eea23011e9c4236c72a247a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14abe5ee99dd207caf5bd51d812cf73517636174bab67fd0cda461ad94be1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11940cf7b6bb581b5c17a55adcdfc768a6dac7babc1d181ae145fa210b1eb9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11940cf7b6bb581b5c17a55adcdfc768a6dac7babc1d181ae145fa210b1eb9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T13:00:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46527343daf5429f576fa768cceeba35d1f59062e578476d59c877ed93083c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46527343daf5429f576fa768cceeba35d1f59062e578476d59c877ed93083c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d7820a50f5329505372fec026c5a882e1c12407a4d79af5eba3b9ad99fa48c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7820a50f5329505372fec026c5a882e1c12407a4d79af5eba3b9ad99fa48c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:00:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:00:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.087106 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cbefab5-1f50-4f44-9163-479625fa11a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d2b92455d360668b3ede8bb8d4f2f2d79281afb56436e4ce3091caee974c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:01:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gz4bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dgmtp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.097877 4959 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xjp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024b990f-a1e8-4ebf-9d60-48afd626881d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e54f44ac25f88c160c33714eb840625058d7990ade630d99805490a452ab2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-07T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zdrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:01:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xjp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.101723 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.101757 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.101767 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.101785 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.101795 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.204506 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.204550 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.204565 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.204587 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.204604 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.306800 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.306843 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.306852 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.306868 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.306879 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.409866 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.409923 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.409938 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.409963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.409978 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.511905 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.511955 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.511970 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.511989 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.512001 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.613721 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.613771 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.613784 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.613803 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.613816 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.716026 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.716079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.716089 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.716106 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.716120 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.807814 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.807814 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.807827 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:29 crc kubenswrapper[4959]: E1007 13:02:29.808166 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:29 crc kubenswrapper[4959]: E1007 13:02:29.807963 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.807839 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:29 crc kubenswrapper[4959]: E1007 13:02:29.808245 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:29 crc kubenswrapper[4959]: E1007 13:02:29.808343 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.819306 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.819368 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.819393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.819427 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.819451 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.922650 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.922697 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.922710 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.922729 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:29 crc kubenswrapper[4959]: I1007 13:02:29.922742 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:29Z","lastTransitionTime":"2025-10-07T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.025738 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.025810 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.025836 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.025869 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.025893 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.128511 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.128566 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.128578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.128597 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.128608 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.230950 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.231003 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.231019 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.231044 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.231057 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.333876 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.333941 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.333966 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.333999 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.334023 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.437242 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.437298 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.437319 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.437350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.437377 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.540337 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.540382 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.540394 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.540413 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.540426 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.642865 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.642911 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.642921 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.642936 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.642947 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.745376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.745423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.745441 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.745466 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.745485 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.847987 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.848027 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.848038 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.848054 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.848065 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.950515 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.950576 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.950588 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.950606 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:30 crc kubenswrapper[4959]: I1007 13:02:30.950620 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:30Z","lastTransitionTime":"2025-10-07T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.053257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.053293 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.053303 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.053317 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.053327 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.156812 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.156939 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.156959 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.156991 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.157009 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.260744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.260806 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.260824 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.260852 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.260869 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.364923 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.364990 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.365005 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.365030 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.365045 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.467847 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.467902 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.467916 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.467936 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.467950 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.570809 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.570850 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.570859 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.570875 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.570884 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.674258 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.674331 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.674357 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.674382 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.674399 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.778018 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.778070 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.778079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.778095 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.778106 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.808476 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.808476 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.808504 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.808650 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:31 crc kubenswrapper[4959]: E1007 13:02:31.808877 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:31 crc kubenswrapper[4959]: E1007 13:02:31.809260 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:31 crc kubenswrapper[4959]: E1007 13:02:31.809549 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:31 crc kubenswrapper[4959]: E1007 13:02:31.809829 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.881563 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.881655 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.881673 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.881741 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.881765 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.984321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.984365 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.984376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.984393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 13:02:31 crc kubenswrapper[4959]: I1007 13:02:31.984404 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:31Z","lastTransitionTime":"2025-10-07T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.087464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.087535 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.087557 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.087583 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.087602 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.112192 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.112260 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.112284 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.112313 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.112339 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: E1007 13:02:32.132317 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.137101 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.137168 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.137206 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.137236 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.137257 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: E1007 13:02:32.158524 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.162854 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.162914 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.162928 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.162968 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.162986 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: E1007 13:02:32.181543 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.185664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.185750 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.185760 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.185780 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.185791 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: E1007 13:02:32.205342 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.210049 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.210118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.210143 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.210178 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.210200 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: E1007 13:02:32.231182 4959 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b5a2585e-1f5c-45d6-b6d6-2e40f29e327a\\\",\\\"systemUUID\\\":\\\"d0865fee-6f9e-434f-89c6-fcfcb332a933\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:02:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:02:32 crc kubenswrapper[4959]: E1007 13:02:32.231418 4959 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.234155 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.234226 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.234244 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.234270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.234288 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.337305 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.337385 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.337410 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.337440 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.337463 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.440159 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.440217 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.440230 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.440250 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.440265 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.547900 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.547946 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.547957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.547975 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.547987 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.650455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.650517 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.650549 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.650574 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.650590 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.752443 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.752528 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.752543 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.752562 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.752575 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.854748 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.854828 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.854848 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.854917 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.854939 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.958558 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.958614 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.958664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.958686 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:32 crc kubenswrapper[4959]: I1007 13:02:32.958697 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:32Z","lastTransitionTime":"2025-10-07T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.061609 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.061676 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.061688 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.061706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.061718 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.164771 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.164812 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.164820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.164835 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.164859 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.267714 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.267759 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.267770 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.267786 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.267795 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.371375 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.371445 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.371468 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.371500 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.371524 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.475218 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.475290 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.475309 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.475328 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.475340 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.577924 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.577995 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.578012 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.578039 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.578064 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.680176 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.680296 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.680317 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.680343 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.680364 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.783295 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.783339 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.783350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.783370 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.783381 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.808087 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.808145 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:33 crc kubenswrapper[4959]: E1007 13:02:33.808179 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.808207 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.808248 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:33 crc kubenswrapper[4959]: E1007 13:02:33.808438 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:33 crc kubenswrapper[4959]: E1007 13:02:33.808516 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:33 crc kubenswrapper[4959]: E1007 13:02:33.808989 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.885275 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.885316 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.885327 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.885344 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.885356 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.988115 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.988179 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.988195 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.988212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:33 crc kubenswrapper[4959]: I1007 13:02:33.988227 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:33Z","lastTransitionTime":"2025-10-07T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.091386 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.091445 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.091469 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.091503 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.091525 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.194351 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.194393 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.194402 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.194423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.194433 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.296992 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.297049 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.297064 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.297087 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.297104 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.400223 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.400276 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.400292 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.400313 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.400326 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.502374 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.502447 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.502460 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.502483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.502496 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.605029 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.605077 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.605088 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.605104 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.605113 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.707979 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.708043 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.708055 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.708073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.708088 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.810754 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.811136 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.811148 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.811172 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.811184 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.913332 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.913398 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.913411 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.913453 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:34 crc kubenswrapper[4959]: I1007 13:02:34.913466 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:34Z","lastTransitionTime":"2025-10-07T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.015517 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.015562 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.015571 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.015585 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.015596 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.118863 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.118957 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.118974 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.118996 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.119016 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.221109 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.221199 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.221213 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.221233 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.221246 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.323223 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.323289 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.323313 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.323350 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.323372 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.426464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.426506 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.426516 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.426531 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.426540 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.528844 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.528889 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.528898 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.528912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.528922 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.631340 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.631423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.631446 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.631465 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.631477 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.733679 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.733715 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.733725 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.733740 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.733748 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.808533 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.808591 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:35 crc kubenswrapper[4959]: E1007 13:02:35.808703 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.808803 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:35 crc kubenswrapper[4959]: E1007 13:02:35.808840 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.808885 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:35 crc kubenswrapper[4959]: E1007 13:02:35.808975 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:35 crc kubenswrapper[4959]: E1007 13:02:35.809030 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.836225 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.836266 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.836274 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.836287 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.836296 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.938281 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.938321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.938331 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.938345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:35 crc kubenswrapper[4959]: I1007 13:02:35.938355 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:35Z","lastTransitionTime":"2025-10-07T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.040423 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.040476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.040491 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.040512 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.040526 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.143210 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.143340 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.143352 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.143370 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.143382 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.246368 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.246419 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.246434 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.246452 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.246467 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.349457 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.349538 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.349561 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.349591 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.349616 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.452176 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.452234 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.452253 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.452277 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.452293 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.555801 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.555885 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.555918 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.555948 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.555971 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.659161 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.659222 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.659239 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.659264 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.659281 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.762138 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.762190 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.762200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.762218 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.762229 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.865247 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.865292 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.865304 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.865321 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.865332 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.969537 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.969612 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.969670 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.969706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:36 crc kubenswrapper[4959]: I1007 13:02:36.969731 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:36Z","lastTransitionTime":"2025-10-07T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.072146 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.072217 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.072240 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.072272 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.072296 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.175608 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.175767 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.175786 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.175813 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.175833 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.279311 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.279376 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.279392 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.279414 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.279433 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.382912 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.382963 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.382980 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.383002 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.383017 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.490526 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.490664 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.491601 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.491775 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.491830 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.594970 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.595042 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.595065 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.595097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.595123 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.698042 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.698079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.698097 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.698118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.698138 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.800922 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.800974 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.800986 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.801005 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.801024 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.808216 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.808251 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.808212 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:37 crc kubenswrapper[4959]: E1007 13:02:37.808354 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:37 crc kubenswrapper[4959]: E1007 13:02:37.808419 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.808450 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:37 crc kubenswrapper[4959]: E1007 13:02:37.808504 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:37 crc kubenswrapper[4959]: E1007 13:02:37.808590 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.903412 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.903455 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.903464 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.903480 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:37 crc kubenswrapper[4959]: I1007 13:02:37.903490 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:37Z","lastTransitionTime":"2025-10-07T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.005949 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.006031 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.006041 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.006057 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.006067 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.108611 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.108681 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.108690 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.108707 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.108718 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.210849 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.210877 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.210890 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.210903 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.210912 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.313078 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.313112 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.313120 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.313134 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.313145 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.416088 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.416159 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.416179 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.416204 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.416224 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.519094 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.519177 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.519200 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.519229 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.519246 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.621978 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.622025 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.622036 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.622054 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.622064 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.725093 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.725155 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.725167 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.725186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.725196 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.809270 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:02:38 crc kubenswrapper[4959]: E1007 13:02:38.809570 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jfm8k_openshift-ovn-kubernetes(b26fd9a1-4343-4f1c-bef3-764d3c74724a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.828227 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.828280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.828289 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.828306 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.828319 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.873845 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=25.873824997 podStartE2EDuration="25.873824997s" podCreationTimestamp="2025-10-07 13:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:38.873450336 +0000 UTC m=+111.034173023" watchObservedRunningTime="2025-10-07 13:02:38.873824997 +0000 UTC m=+111.034547674" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.874049 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7xjp6" podStartSLOduration=88.874045463 podStartE2EDuration="1m28.874045463s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:38.832425104 +0000 UTC m=+110.993147821" watchObservedRunningTime="2025-10-07 13:02:38.874045463 +0000 UTC m=+111.034768130" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.916649 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podStartSLOduration=89.916609498 podStartE2EDuration="1m29.916609498s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:38.895151081 +0000 UTC m=+111.055873778" watchObservedRunningTime="2025-10-07 13:02:38.916609498 +0000 UTC m=+111.077332175" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.916944 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b2pc7" podStartSLOduration=89.916936377 
podStartE2EDuration="1m29.916936377s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:38.916451184 +0000 UTC m=+111.077173881" watchObservedRunningTime="2025-10-07 13:02:38.916936377 +0000 UTC m=+111.077659054" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.931532 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.931580 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.931592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.931612 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.931658 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:38Z","lastTransitionTime":"2025-10-07T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:38 crc kubenswrapper[4959]: I1007 13:02:38.963133 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ln4wb" podStartSLOduration=89.963115495 podStartE2EDuration="1m29.963115495s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:38.947560894 +0000 UTC m=+111.108283571" watchObservedRunningTime="2025-10-07 13:02:38.963115495 +0000 UTC m=+111.123838172" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.017582 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w62d8" podStartSLOduration=90.017538486 podStartE2EDuration="1m30.017538486s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:39.016350892 +0000 UTC m=+111.177073589" watchObservedRunningTime="2025-10-07 13:02:39.017538486 +0000 UTC m=+111.178261163" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.034086 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.034146 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.034159 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.034179 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.034192 4959 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.052325 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4tbb5" podStartSLOduration=89.05229939 podStartE2EDuration="1m29.05229939s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:39.036214495 +0000 UTC m=+111.196937192" watchObservedRunningTime="2025-10-07 13:02:39.05229939 +0000 UTC m=+111.213022087" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.069198 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.069176478 podStartE2EDuration="1m29.069176478s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:39.052253009 +0000 UTC m=+111.212975696" watchObservedRunningTime="2025-10-07 13:02:39.069176478 +0000 UTC m=+111.229899165" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.096230 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.096210694 podStartE2EDuration="57.096210694s" podCreationTimestamp="2025-10-07 13:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 13:02:39.07028836 +0000 UTC m=+111.231011037" watchObservedRunningTime="2025-10-07 13:02:39.096210694 +0000 UTC m=+111.256933371" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.136407 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.136449 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.136459 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.136474 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.136484 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.184511 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.184487972 podStartE2EDuration="29.184487972s" podCreationTimestamp="2025-10-07 13:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:39.18371365 +0000 UTC m=+111.344436347" watchObservedRunningTime="2025-10-07 13:02:39.184487972 +0000 UTC m=+111.345210649" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.209755 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=88.209734677 podStartE2EDuration="1m28.209734677s" podCreationTimestamp="2025-10-07 13:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:39.209617674 +0000 UTC m=+111.370340351" watchObservedRunningTime="2025-10-07 13:02:39.209734677 +0000 UTC m=+111.370457354" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.238699 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.238747 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.238758 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.238773 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.238784 4959 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.341431 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.341471 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.341481 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.341496 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.341506 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.444512 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.444591 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.444616 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.444690 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.444718 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.547186 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.547242 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.547257 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.547278 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.547291 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.650429 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.650497 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.650514 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.650536 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.650551 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.753736 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.753813 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.753834 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.753869 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.753898 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.808242 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.808323 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.808336 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.808422 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:39 crc kubenswrapper[4959]: E1007 13:02:39.808424 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:39 crc kubenswrapper[4959]: E1007 13:02:39.808555 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:39 crc kubenswrapper[4959]: E1007 13:02:39.808658 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:39 crc kubenswrapper[4959]: E1007 13:02:39.808974 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.856477 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.856509 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.856517 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.856534 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.856544 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.959280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.959367 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.959387 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.959414 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:39 crc kubenswrapper[4959]: I1007 13:02:39.959435 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:39Z","lastTransitionTime":"2025-10-07T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.062166 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.062212 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.062221 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.062237 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.062249 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.164773 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.164821 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.164831 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.164849 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.164859 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.268130 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.268544 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.268621 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.268744 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.268836 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.371430 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.371474 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.371486 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.371505 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.371518 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.474792 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.474870 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.474882 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.474901 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.474913 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.578498 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.578549 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.578559 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.578576 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.578586 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.682365 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.682433 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.682451 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.682480 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.682496 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.786394 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.786450 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.786463 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.786483 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.786495 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.889815 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.889867 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.889877 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.889898 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.889908 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.992991 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.993039 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.993049 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.993066 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:40 crc kubenswrapper[4959]: I1007 13:02:40.993078 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:40Z","lastTransitionTime":"2025-10-07T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.096454 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.096525 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.096545 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.096572 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.096596 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.200911 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.201102 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.201216 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.201259 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.201340 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.305551 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.305700 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.305741 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.305779 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.305800 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.408888 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.408932 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.408944 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.408966 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.408982 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.513741 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.514306 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.514536 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.514765 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.514915 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.618280 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.618345 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.618402 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.618431 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.618457 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.721665 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.721718 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.721739 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.721762 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.721773 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.808584 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.808790 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.808872 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:41 crc kubenswrapper[4959]: E1007 13:02:41.808861 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.808886 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:41 crc kubenswrapper[4959]: E1007 13:02:41.809007 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:41 crc kubenswrapper[4959]: E1007 13:02:41.809130 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:41 crc kubenswrapper[4959]: E1007 13:02:41.809336 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.825016 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.825079 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.825096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.825118 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.825134 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.927768 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.928120 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.928243 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.928408 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:41 crc kubenswrapper[4959]: I1007 13:02:41.928545 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:41Z","lastTransitionTime":"2025-10-07T13:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.031237 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.031273 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.031284 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.031302 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.031313 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.134461 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.134530 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.134551 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.134578 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.134595 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.238073 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.238127 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.238144 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.238168 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.238185 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.340949 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.341015 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.341034 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.341058 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.341079 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.445476 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.445571 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.445592 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.445659 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.445688 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.548706 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.549093 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.549270 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.549573 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.549801 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.568023 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.568096 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.568115 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.568147 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.568171 4959 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:02:42Z","lastTransitionTime":"2025-10-07T13:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.635953 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll"] Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.638996 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.641911 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.642373 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.642682 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.642998 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.663287 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/174802ef-5728-4783-a1f2-2f47362c5d85-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.663393 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/174802ef-5728-4783-a1f2-2f47362c5d85-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.663440 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/174802ef-5728-4783-a1f2-2f47362c5d85-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.663499 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/174802ef-5728-4783-a1f2-2f47362c5d85-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.663590 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/174802ef-5728-4783-a1f2-2f47362c5d85-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.764563 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/174802ef-5728-4783-a1f2-2f47362c5d85-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.764687 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/174802ef-5728-4783-a1f2-2f47362c5d85-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.764741 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/174802ef-5728-4783-a1f2-2f47362c5d85-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.764789 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/174802ef-5728-4783-a1f2-2f47362c5d85-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.764899 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/174802ef-5728-4783-a1f2-2f47362c5d85-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.765028 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/174802ef-5728-4783-a1f2-2f47362c5d85-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.764917 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/174802ef-5728-4783-a1f2-2f47362c5d85-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.766439 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/174802ef-5728-4783-a1f2-2f47362c5d85-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.781878 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/174802ef-5728-4783-a1f2-2f47362c5d85-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.796040 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/174802ef-5728-4783-a1f2-2f47362c5d85-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dsbll\" (UID: \"174802ef-5728-4783-a1f2-2f47362c5d85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:42 crc kubenswrapper[4959]: I1007 13:02:42.973425 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" Oct 07 13:02:43 crc kubenswrapper[4959]: I1007 13:02:43.566437 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" event={"ID":"174802ef-5728-4783-a1f2-2f47362c5d85","Type":"ContainerStarted","Data":"abf0b10937e40d73e6a1e1a50014899e45125f1d0f04c6ba9967ba07b3f22124"} Oct 07 13:02:43 crc kubenswrapper[4959]: I1007 13:02:43.566496 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" event={"ID":"174802ef-5728-4783-a1f2-2f47362c5d85","Type":"ContainerStarted","Data":"4046b283f352145ba8aa2d62fe5a0db68f4e0d861cd396cfbaa1debd544d264b"} Oct 07 13:02:43 crc kubenswrapper[4959]: I1007 13:02:43.808575 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:02:43 crc kubenswrapper[4959]: I1007 13:02:43.808597 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:02:43 crc kubenswrapper[4959]: I1007 13:02:43.808788 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:02:43 crc kubenswrapper[4959]: E1007 13:02:43.808953 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:02:43 crc kubenswrapper[4959]: I1007 13:02:43.809009 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:02:43 crc kubenswrapper[4959]: E1007 13:02:43.809153 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298" Oct 07 13:02:43 crc kubenswrapper[4959]: E1007 13:02:43.809278 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:02:43 crc kubenswrapper[4959]: E1007 13:02:43.809340 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.572030 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/1.log" Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.572714 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/0.log" Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.572787 4959 generic.go:334] "Generic (PLEG): container finished" podID="07e132b2-5c1c-488e-abf4-bdaf3fcf4f93" containerID="6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae" exitCode=1 Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.572838 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerDied","Data":"6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae"} Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.572897 4959 scope.go:117] "RemoveContainer" containerID="db4a6916ff6306d372102484a9e768f1c4f9f622434cf32aa6012b646c730862" Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.573494 4959 scope.go:117] "RemoveContainer" containerID="6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae" Oct 07 13:02:44 crc kubenswrapper[4959]: E1007 13:02:44.573767 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-b2pc7_openshift-multus(07e132b2-5c1c-488e-abf4-bdaf3fcf4f93)\"" pod="openshift-multus/multus-b2pc7" podUID="07e132b2-5c1c-488e-abf4-bdaf3fcf4f93" Oct 07 13:02:44 crc kubenswrapper[4959]: I1007 13:02:44.596316 4959 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dsbll" podStartSLOduration=95.596293927 podStartE2EDuration="1m35.596293927s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:43.59233637 +0000 UTC m=+115.753059137" watchObservedRunningTime="2025-10-07 13:02:44.596293927 +0000 UTC m=+116.757016624"
Oct 07 13:02:45 crc kubenswrapper[4959]: I1007 13:02:45.577919 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/1.log"
Oct 07 13:02:45 crc kubenswrapper[4959]: I1007 13:02:45.808224 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:45 crc kubenswrapper[4959]: I1007 13:02:45.808281 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:45 crc kubenswrapper[4959]: I1007 13:02:45.808288 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:45 crc kubenswrapper[4959]: I1007 13:02:45.808413 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:45 crc kubenswrapper[4959]: E1007 13:02:45.808587 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:45 crc kubenswrapper[4959]: E1007 13:02:45.808825 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:45 crc kubenswrapper[4959]: E1007 13:02:45.808967 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:45 crc kubenswrapper[4959]: E1007 13:02:45.809083 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:47 crc kubenswrapper[4959]: I1007 13:02:47.808391 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:47 crc kubenswrapper[4959]: I1007 13:02:47.808391 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:47 crc kubenswrapper[4959]: E1007 13:02:47.809415 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:47 crc kubenswrapper[4959]: I1007 13:02:47.808420 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:47 crc kubenswrapper[4959]: I1007 13:02:47.808415 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:47 crc kubenswrapper[4959]: E1007 13:02:47.809995 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:47 crc kubenswrapper[4959]: E1007 13:02:47.810091 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:47 crc kubenswrapper[4959]: E1007 13:02:47.810144 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:48 crc kubenswrapper[4959]: E1007 13:02:48.845406 4959 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 07 13:02:48 crc kubenswrapper[4959]: E1007 13:02:48.903548 4959 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 07 13:02:49 crc kubenswrapper[4959]: I1007 13:02:49.808738 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:49 crc kubenswrapper[4959]: I1007 13:02:49.808740 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:49 crc kubenswrapper[4959]: I1007 13:02:49.808834 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:49 crc kubenswrapper[4959]: I1007 13:02:49.808845 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:49 crc kubenswrapper[4959]: E1007 13:02:49.809260 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:49 crc kubenswrapper[4959]: E1007 13:02:49.809425 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:49 crc kubenswrapper[4959]: E1007 13:02:49.809534 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:49 crc kubenswrapper[4959]: E1007 13:02:49.809591 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:49 crc kubenswrapper[4959]: I1007 13:02:49.810081 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"
Oct 07 13:02:50 crc kubenswrapper[4959]: I1007 13:02:50.562780 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g57ch"]
Oct 07 13:02:50 crc kubenswrapper[4959]: I1007 13:02:50.620930 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/3.log"
Oct 07 13:02:50 crc kubenswrapper[4959]: I1007 13:02:50.624461 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerStarted","Data":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"}
Oct 07 13:02:50 crc kubenswrapper[4959]: I1007 13:02:50.624507 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:50 crc kubenswrapper[4959]: E1007 13:02:50.624644 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:50 crc kubenswrapper[4959]: I1007 13:02:50.624954 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k"
Oct 07 13:02:50 crc kubenswrapper[4959]: I1007 13:02:50.648827 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podStartSLOduration=101.648810572 podStartE2EDuration="1m41.648810572s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:02:50.648788862 +0000 UTC m=+122.809511589" watchObservedRunningTime="2025-10-07 13:02:50.648810572 +0000 UTC m=+122.809533239"
Oct 07 13:02:51 crc kubenswrapper[4959]: I1007 13:02:51.808081 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:51 crc kubenswrapper[4959]: I1007 13:02:51.808162 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:51 crc kubenswrapper[4959]: E1007 13:02:51.808664 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:51 crc kubenswrapper[4959]: I1007 13:02:51.808169 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:51 crc kubenswrapper[4959]: E1007 13:02:51.808879 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:51 crc kubenswrapper[4959]: E1007 13:02:51.808950 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:52 crc kubenswrapper[4959]: I1007 13:02:52.808296 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:52 crc kubenswrapper[4959]: E1007 13:02:52.808450 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:53 crc kubenswrapper[4959]: I1007 13:02:53.808722 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:53 crc kubenswrapper[4959]: E1007 13:02:53.808934 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:53 crc kubenswrapper[4959]: I1007 13:02:53.809195 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:53 crc kubenswrapper[4959]: E1007 13:02:53.809355 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:53 crc kubenswrapper[4959]: I1007 13:02:53.809745 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:53 crc kubenswrapper[4959]: E1007 13:02:53.809827 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:53 crc kubenswrapper[4959]: E1007 13:02:53.905599 4959 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 07 13:02:54 crc kubenswrapper[4959]: I1007 13:02:54.808666 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:54 crc kubenswrapper[4959]: E1007 13:02:54.808842 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:55 crc kubenswrapper[4959]: I1007 13:02:55.807875 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:55 crc kubenswrapper[4959]: I1007 13:02:55.807890 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:55 crc kubenswrapper[4959]: I1007 13:02:55.808023 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:55 crc kubenswrapper[4959]: E1007 13:02:55.808208 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:55 crc kubenswrapper[4959]: E1007 13:02:55.808329 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:55 crc kubenswrapper[4959]: E1007 13:02:55.808756 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:56 crc kubenswrapper[4959]: I1007 13:02:56.808529 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:56 crc kubenswrapper[4959]: E1007 13:02:56.808713 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:57 crc kubenswrapper[4959]: I1007 13:02:57.808737 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:57 crc kubenswrapper[4959]: I1007 13:02:57.808761 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:57 crc kubenswrapper[4959]: E1007 13:02:57.808888 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:02:57 crc kubenswrapper[4959]: E1007 13:02:57.808918 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:57 crc kubenswrapper[4959]: I1007 13:02:57.808845 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:57 crc kubenswrapper[4959]: E1007 13:02:57.809054 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:58 crc kubenswrapper[4959]: I1007 13:02:58.808438 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:02:58 crc kubenswrapper[4959]: E1007 13:02:58.810193 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:02:58 crc kubenswrapper[4959]: E1007 13:02:58.906278 4959 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 07 13:02:59 crc kubenswrapper[4959]: I1007 13:02:59.807803 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:02:59 crc kubenswrapper[4959]: I1007 13:02:59.808044 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:02:59 crc kubenswrapper[4959]: E1007 13:02:59.808145 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:02:59 crc kubenswrapper[4959]: I1007 13:02:59.808319 4959 scope.go:117] "RemoveContainer" containerID="6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae"
Oct 07 13:02:59 crc kubenswrapper[4959]: I1007 13:02:59.808785 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:02:59 crc kubenswrapper[4959]: E1007 13:02:59.809138 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:02:59 crc kubenswrapper[4959]: E1007 13:02:59.809476 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:03:00 crc kubenswrapper[4959]: I1007 13:03:00.656518 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/1.log"
Oct 07 13:03:00 crc kubenswrapper[4959]: I1007 13:03:00.656588 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerStarted","Data":"be6fa8893a9af981bea7715b2b6e5dc55dd168d348c042414a85e37d321aecc4"}
Oct 07 13:03:00 crc kubenswrapper[4959]: I1007 13:03:00.807806 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:03:00 crc kubenswrapper[4959]: E1007 13:03:00.807940 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:03:01 crc kubenswrapper[4959]: I1007 13:03:01.808053 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:03:01 crc kubenswrapper[4959]: I1007 13:03:01.808113 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:03:01 crc kubenswrapper[4959]: I1007 13:03:01.808179 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:03:01 crc kubenswrapper[4959]: E1007 13:03:01.808224 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:03:01 crc kubenswrapper[4959]: E1007 13:03:01.808369 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:03:01 crc kubenswrapper[4959]: E1007 13:03:01.808459 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:03:02 crc kubenswrapper[4959]: I1007 13:03:02.808124 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:03:02 crc kubenswrapper[4959]: E1007 13:03:02.808339 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g57ch" podUID="ed03c94e-16fb-42f7-8383-ac7c2c403298"
Oct 07 13:03:03 crc kubenswrapper[4959]: I1007 13:03:03.807915 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:03:03 crc kubenswrapper[4959]: I1007 13:03:03.807941 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:03:03 crc kubenswrapper[4959]: E1007 13:03:03.808123 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 13:03:03 crc kubenswrapper[4959]: I1007 13:03:03.807952 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:03:03 crc kubenswrapper[4959]: E1007 13:03:03.808225 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 13:03:03 crc kubenswrapper[4959]: E1007 13:03:03.808352 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 13:03:04 crc kubenswrapper[4959]: I1007 13:03:04.808088 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch"
Oct 07 13:03:04 crc kubenswrapper[4959]: I1007 13:03:04.816059 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 07 13:03:04 crc kubenswrapper[4959]: I1007 13:03:04.816149 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.808134 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.808205 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.808284 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.810920 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.810992 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.811058 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 07 13:03:05 crc kubenswrapper[4959]: I1007 13:03:05.811157 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.311820 4959 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.356064 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.357028 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.360926 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.363790 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cptrk"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.364820 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.364963 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b48pv"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.365186 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.374657 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-b48pv"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.378486 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.378723 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.378507 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.378988 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.379006 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.378522 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.378929 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.379408 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.379826 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.380052 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.380193 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.380112 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.380155 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.380405 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.380513 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.382524 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b99bm"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.383128 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.383319 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.383598 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.383864 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.383982 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b99bm"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.384231 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.384242 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.384790 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.384890 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.385072 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.385192 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.389718 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-j72km"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.390170 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.394496 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396016 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5802c077-db54-4212-936e-4aef4e394099-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396058 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-audit-dir\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396085 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c09a1b4-7678-463a-9bba-f97153ade5ad-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396104 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-config\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396126 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb768738-0e4a-41f4-ba72-2282a201fa5b-config\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396144 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396186 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-config\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396212 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4kx\" (UniqueName: \"kubernetes.io/projected/5c09a1b4-7678-463a-9bba-f97153ade5ad-kube-api-access-kq4kx\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396241 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb768738-0e4a-41f4-ba72-2282a201fa5b-serving-cert\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396275 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-encryption-config\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396309 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbl7w\" (UniqueName: \"kubernetes.io/projected/dd723ca8-9cfa-465f-b706-feaa015d9e0d-kube-api-access-lbl7w\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396337 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-oauth-serving-cert\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396358 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6srw\" (UniqueName: \"kubernetes.io/projected/cb768738-0e4a-41f4-ba72-2282a201fa5b-kube-api-access-r6srw\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.396382 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-node-pullsecrets\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396405 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjh7r\" (UniqueName: \"kubernetes.io/projected/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-kube-api-access-jjh7r\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396436 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c09a1b4-7678-463a-9bba-f97153ade5ad-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396456 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-client-ca\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396477 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5802c077-db54-4212-936e-4aef4e394099-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396497 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc42fb-61ba-4342-98c6-45535e156eb6-serving-cert\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396515 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-etcd-serving-ca\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396551 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396574 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-config\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.396594 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-serving-cert\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396613 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-service-ca\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396656 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzc4\" (UniqueName: \"kubernetes.io/projected/58bc42fb-61ba-4342-98c6-45535e156eb6-kube-api-access-lmzc4\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396679 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-trusted-ca-bundle\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396700 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd888a7-3fd1-4d12-8005-94fdae5be125-serving-cert\") pod \"controller-manager-879f6c89f-cptrk\" (UID: 
\"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396721 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-image-import-ca\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396742 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-oauth-config\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396761 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-serving-cert\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396783 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-client-ca\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396803 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-audit\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396829 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ng9v\" (UniqueName: \"kubernetes.io/projected/5802c077-db54-4212-936e-4aef4e394099-kube-api-access-6ng9v\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396850 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8fb29fc-2a2e-4d11-827c-426683acaef5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8qcxk\" (UID: \"b8fb29fc-2a2e-4d11-827c-426683acaef5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396875 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht2tk\" (UniqueName: \"kubernetes.io/projected/b8fb29fc-2a2e-4d11-827c-426683acaef5-kube-api-access-ht2tk\") pod \"cluster-samples-operator-665b6dd947-8qcxk\" (UID: \"b8fb29fc-2a2e-4d11-827c-426683acaef5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396899 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-config\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396921 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmtdr\" (UniqueName: \"kubernetes.io/projected/6fd888a7-3fd1-4d12-8005-94fdae5be125-kube-api-access-fmtdr\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396940 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb768738-0e4a-41f4-ba72-2282a201fa5b-trusted-ca\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.396959 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-etcd-client\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.397399 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.397599 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-np26j"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.398140 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-np26j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.399264 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.399450 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.399498 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ptjbf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.399807 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.400080 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.400329 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.400497 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.400917 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.401050 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.401238 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.401555 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.401888 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.402259 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.401115 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.402804 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.402923 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.403042 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 
13:03:13.403095 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.403227 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.402808 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.403455 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.404664 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.405446 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.410702 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.410917 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411265 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411438 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411544 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" 
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411663 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411747 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411876 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.411980 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.412280 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.412318 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.418875 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.419164 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.419747 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.419954 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.419982 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 13:03:13 
crc kubenswrapper[4959]: I1007 13:03:13.419168 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.420370 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.420824 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.420955 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.421150 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c5bnk"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.423398 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gct22"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.424228 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.424403 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.432123 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.432242 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.495830 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.496583 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.496653 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.496819 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.496835 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.497058 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.497311 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.497676 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.496934 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498552 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498611 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht2tk\" (UniqueName: \"kubernetes.io/projected/b8fb29fc-2a2e-4d11-827c-426683acaef5-kube-api-access-ht2tk\") pod \"cluster-samples-operator-665b6dd947-8qcxk\" (UID: \"b8fb29fc-2a2e-4d11-827c-426683acaef5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498674 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-config\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498698 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmtdr\" (UniqueName: \"kubernetes.io/projected/6fd888a7-3fd1-4d12-8005-94fdae5be125-kube-api-access-fmtdr\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498720 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb768738-0e4a-41f4-ba72-2282a201fa5b-trusted-ca\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " 
pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498743 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-etcd-client\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498763 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5802c077-db54-4212-936e-4aef4e394099-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498783 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-audit-dir\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498812 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0203e72-df97-4a97-8f45-65175f7d9839-images\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0203e72-df97-4a97-8f45-65175f7d9839-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498859 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c09a1b4-7678-463a-9bba-f97153ade5ad-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498880 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-config\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498934 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-config\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498954 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb768738-0e4a-41f4-ba72-2282a201fa5b-config\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498974 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498985 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.498998 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-policies\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499025 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4kx\" (UniqueName: \"kubernetes.io/projected/5c09a1b4-7678-463a-9bba-f97153ade5ad-kube-api-access-kq4kx\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499048 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499073 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cb768738-0e4a-41f4-ba72-2282a201fa5b-serving-cert\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499108 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-encryption-config\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499131 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbk55\" (UniqueName: \"kubernetes.io/projected/747a40c2-ac24-4a4d-b444-437ea791dbe1-kube-api-access-kbk55\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499175 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499186 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499210 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/747a40c2-ac24-4a4d-b444-437ea791dbe1-machine-approver-tls\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499234 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbl7w\" (UniqueName: \"kubernetes.io/projected/dd723ca8-9cfa-465f-b706-feaa015d9e0d-kube-api-access-lbl7w\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499273 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499301 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-oauth-serving-cert\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499326 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6srw\" (UniqueName: \"kubernetes.io/projected/cb768738-0e4a-41f4-ba72-2282a201fa5b-kube-api-access-r6srw\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.499347 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-node-pullsecrets\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499370 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499390 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499410 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-dir\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499434 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tckch\" (UniqueName: 
\"kubernetes.io/projected/4c4b9607-9bba-4c4d-ab60-a3e14de17949-kube-api-access-tckch\") pod \"dns-operator-744455d44c-ptjbf\" (UID: \"4c4b9607-9bba-4c4d-ab60-a3e14de17949\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499457 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499481 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjh7r\" (UniqueName: \"kubernetes.io/projected/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-kube-api-access-jjh7r\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499504 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fncc7\" (UniqueName: \"kubernetes.io/projected/893a583d-ded3-4c17-b16c-b8f0e18ace91-kube-api-access-fncc7\") pod \"downloads-7954f5f757-np26j\" (UID: \"893a583d-ded3-4c17-b16c-b8f0e18ace91\") " pod="openshift-console/downloads-7954f5f757-np26j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499526 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x968\" (UniqueName: \"kubernetes.io/projected/5ed6f47e-1445-40fb-a469-690dc49e5974-kube-api-access-6x968\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.499565 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499592 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c09a1b4-7678-463a-9bba-f97153ade5ad-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499599 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499612 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-client-ca\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499654 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5802c077-db54-4212-936e-4aef4e394099-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499674 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc42fb-61ba-4342-98c6-45535e156eb6-serving-cert\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499694 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-etcd-serving-ca\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499717 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/747a40c2-ac24-4a4d-b444-437ea791dbe1-auth-proxy-config\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499739 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499762 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c4b9607-9bba-4c4d-ab60-a3e14de17949-metrics-tls\") pod \"dns-operator-744455d44c-ptjbf\" (UID: \"4c4b9607-9bba-4c4d-ab60-a3e14de17949\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499800 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499826 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-config\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499851 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-serving-cert\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499872 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-service-ca\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499893 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0203e72-df97-4a97-8f45-65175f7d9839-config\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499903 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499919 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499945 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzc4\" (UniqueName: \"kubernetes.io/projected/58bc42fb-61ba-4342-98c6-45535e156eb6-kube-api-access-lmzc4\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499968 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747a40c2-ac24-4a4d-b444-437ea791dbe1-config\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499991 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500015 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-trusted-ca-bundle\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500038 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd888a7-3fd1-4d12-8005-94fdae5be125-serving-cert\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500060 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-image-import-ca\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500082 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-oauth-config\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500103 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-serving-cert\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500129 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-client-ca\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500158 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64jhh\" (UniqueName: \"kubernetes.io/projected/d0203e72-df97-4a97-8f45-65175f7d9839-kube-api-access-64jhh\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500183 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500209 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-audit\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500239 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ng9v\" (UniqueName: \"kubernetes.io/projected/5802c077-db54-4212-936e-4aef4e394099-kube-api-access-6ng9v\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: 
\"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.500266 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8fb29fc-2a2e-4d11-827c-426683acaef5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8qcxk\" (UID: \"b8fb29fc-2a2e-4d11-827c-426683acaef5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.503196 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-config\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.504876 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb768738-0e4a-41f4-ba72-2282a201fa5b-trusted-ca\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.506936 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.507066 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.507211 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.499531 4959 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.509028 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.509296 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9rgh8"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.509532 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.509553 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.509896 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.510316 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.510776 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.510961 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.511108 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.511296 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.514482 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-etcd-serving-ca\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.515983 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-image-import-ca\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.517049 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5802c077-db54-4212-936e-4aef4e394099-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.517096 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-audit-dir\") pod \"apiserver-76f77b778f-b99bm\" (UID: 
\"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.517664 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.518163 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jn6sf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.518472 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-config\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.518566 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4wfvg"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.519023 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.519329 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.517665 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c09a1b4-7678-463a-9bba-f97153ade5ad-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.519946 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-config\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.520588 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-service-ca\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.521011 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-config\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.521526 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-client-ca\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.521663 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.522785 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb768738-0e4a-41f4-ba72-2282a201fa5b-serving-cert\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.522114 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-audit\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.521688 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.523161 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.524148 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.524929 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-client-ca\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.525216 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.525483 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.525728 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.526459 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb768738-0e4a-41f4-ba72-2282a201fa5b-config\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.527400 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.527673 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-node-pullsecrets\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.528492 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-trusted-ca-bundle\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.530178 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.530895 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.530989 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.531231 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.531521 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.531731 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.530924 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.531991 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.532043 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-oauth-serving-cert\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.532415 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wvxrf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.532919 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.533135 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.562256 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-oauth-config\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.562550 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-encryption-config\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563272 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc42fb-61ba-4342-98c6-45535e156eb6-serving-cert\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563267 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8fb29fc-2a2e-4d11-827c-426683acaef5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8qcxk\" (UID: \"b8fb29fc-2a2e-4d11-827c-426683acaef5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563334 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c09a1b4-7678-463a-9bba-f97153ade5ad-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563365 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd888a7-3fd1-4d12-8005-94fdae5be125-serving-cert\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563378 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563448 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563527 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563678 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563695 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-serving-cert\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563867 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563895 4959 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.564040 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.564248 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.564675 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.564825 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.563276 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-serving-cert\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.565517 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.565916 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.566046 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.566217 4959 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.566669 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-etcd-client\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.567201 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.567894 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.568324 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.570103 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmtdr\" (UniqueName: \"kubernetes.io/projected/6fd888a7-3fd1-4d12-8005-94fdae5be125-kube-api-access-fmtdr\") pod \"controller-manager-879f6c89f-cptrk\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.570155 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.570334 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.571431 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.571602 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.571987 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.572045 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.572387 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht2tk\" (UniqueName: \"kubernetes.io/projected/b8fb29fc-2a2e-4d11-827c-426683acaef5-kube-api-access-ht2tk\") pod \"cluster-samples-operator-665b6dd947-8qcxk\" (UID: \"b8fb29fc-2a2e-4d11-827c-426683acaef5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.574363 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5802c077-db54-4212-936e-4aef4e394099-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.580034 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jjh7r\" (UniqueName: \"kubernetes.io/projected/e7ea9371-6a1a-4aac-8361-f9f68dcdc194-kube-api-access-jjh7r\") pod \"apiserver-76f77b778f-b99bm\" (UID: \"e7ea9371-6a1a-4aac-8361-f9f68dcdc194\") " pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.581766 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.582019 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.582448 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.582649 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.582689 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.585616 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.586930 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.587962 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hmdw2"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.588148 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.589011 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.589440 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.589576 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.589971 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.590584 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.590929 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z65mr"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.591540 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.595804 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.596194 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.596256 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.596306 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cptrk"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.598468 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.599281 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ptjbf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.600687 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601040 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-etcd-service-ca\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601082 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601112 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbk55\" (UniqueName: 
\"kubernetes.io/projected/747a40c2-ac24-4a4d-b444-437ea791dbe1-kube-api-access-kbk55\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svmll\" (UniqueName: \"kubernetes.io/projected/235fc95a-b9be-4d4b-82af-87213702f88d-kube-api-access-svmll\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601149 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b487cdd-8a08-4621-9259-567d66d5cc06-secret-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601169 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601200 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235fc95a-b9be-4d4b-82af-87213702f88d-config\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601219 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmg9r\" (UniqueName: \"kubernetes.io/projected/473f716e-b637-46dd-a9aa-c6fd19b2c12d-kube-api-access-rmg9r\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601235 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-cabundle\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601252 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds8hv\" (UniqueName: \"kubernetes.io/projected/85df292a-1000-48f0-be15-823ada38a57b-kube-api-access-ds8hv\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601272 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601293 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601311 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d810e619-3771-414f-a7f6-87ab4f186478-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601338 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a82e244-d9ca-4b9f-b309-6a795b1385fd-config\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601358 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-config\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601374 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tckch\" (UniqueName: \"kubernetes.io/projected/4c4b9607-9bba-4c4d-ab60-a3e14de17949-kube-api-access-tckch\") pod \"dns-operator-744455d44c-ptjbf\" 
(UID: \"4c4b9607-9bba-4c4d-ab60-a3e14de17949\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601390 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9mk\" (UniqueName: \"kubernetes.io/projected/74633d3d-6855-4249-a997-ee82ea68771b-kube-api-access-zt9mk\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601408 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt2ch\" (UniqueName: \"kubernetes.io/projected/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-kube-api-access-dt2ch\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601422 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601440 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x968\" (UniqueName: \"kubernetes.io/projected/5ed6f47e-1445-40fb-a469-690dc49e5974-kube-api-access-6x968\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601458 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8c54507-e974-490a-97d8-601f70886942-etcd-client\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601474 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-srv-cert\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601497 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601515 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601531 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74633d3d-6855-4249-a997-ee82ea68771b-trusted-ca\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601548 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601567 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c4b9607-9bba-4c4d-ab60-a3e14de17949-metrics-tls\") pod \"dns-operator-744455d44c-ptjbf\" (UID: \"4c4b9607-9bba-4c4d-ab60-a3e14de17949\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601586 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/747a40c2-ac24-4a4d-b444-437ea791dbe1-auth-proxy-config\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601611 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d682e57-80e9-495e-b81b-d49e5ffda0f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601661 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601678 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rxj\" (UniqueName: \"kubernetes.io/projected/2f920e09-08a8-49c4-b217-c53a126eb3bf-kube-api-access-l7rxj\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601695 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601722 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747a40c2-ac24-4a4d-b444-437ea791dbe1-config\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601738 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74633d3d-6855-4249-a997-ee82ea68771b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 
13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601763 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-config\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601791 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64jhh\" (UniqueName: \"kubernetes.io/projected/d0203e72-df97-4a97-8f45-65175f7d9839-kube-api-access-64jhh\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601807 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-encryption-config\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601822 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xt7\" (UniqueName: \"kubernetes.io/projected/879a5223-92dc-4b9e-9749-15fe196c09dd-kube-api-access-24xt7\") pod \"migrator-59844c95c7-8l4sz\" (UID: \"879a5223-92dc-4b9e-9749-15fe196c09dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601845 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht67\" (UniqueName: \"kubernetes.io/projected/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-kube-api-access-mht67\") pod 
\"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601862 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f29540-afe4-44a5-af16-a86cb8d700da-audit-dir\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601879 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn58g\" (UniqueName: \"kubernetes.io/projected/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-kube-api-access-vn58g\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601894 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78d2cce2-47b1-419c-92f4-3c83d499804c-config\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601910 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2277b5ae-22d7-4c52-8660-40b4c4a382b6-serving-cert\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601925 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f920e09-08a8-49c4-b217-c53a126eb3bf-service-ca-bundle\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601940 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d810e619-3771-414f-a7f6-87ab4f186478-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601955 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/5b487cdd-8a08-4621-9259-567d66d5cc06-kube-api-access-lt98r\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601974 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-policies\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.601993 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-idp-0-file-data\") 
pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602008 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a82e244-d9ca-4b9f-b309-6a795b1385fd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602024 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw9jm\" (UniqueName: \"kubernetes.io/projected/ea769db2-2681-4d3a-9968-ec18475c4690-kube-api-access-vw9jm\") pod \"multus-admission-controller-857f4d67dd-wvxrf\" (UID: \"ea769db2-2681-4d3a-9968-ec18475c4690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602043 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25a345f3-48cb-42f4-945f-1c373095e97b-proxy-tls\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602067 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.602089 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235fc95a-b9be-4d4b-82af-87213702f88d-serving-cert\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602112 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78d2cce2-47b1-419c-92f4-3c83d499804c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602138 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-service-ca-bundle\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602153 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea769db2-2681-4d3a-9968-ec18475c4690-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wvxrf\" (UID: \"ea769db2-2681-4d3a-9968-ec18475c4690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602176 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/747a40c2-ac24-4a4d-b444-437ea791dbe1-machine-approver-tls\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602192 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-images\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602207 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-srv-cert\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602242 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-metrics-certs\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602258 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvqw\" (UniqueName: \"kubernetes.io/projected/d810e619-3771-414f-a7f6-87ab4f186478-kube-api-access-gzvqw\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602274 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7136b871-7eea-4199-8712-643539681f53-proxy-tls\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602292 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w564f\" (UniqueName: \"kubernetes.io/projected/7136b871-7eea-4199-8712-643539681f53-kube-api-access-w564f\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602316 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602332 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25a345f3-48cb-42f4-945f-1c373095e97b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602348 
4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-default-certificate\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602363 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78d2cce2-47b1-419c-92f4-3c83d499804c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602381 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602396 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-dir\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602410 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-etcd-ca\") pod \"etcd-operator-b45778765-9rgh8\" (UID: 
\"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602424 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-etcd-client\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602438 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-webhook-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602455 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602470 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrfkh\" (UniqueName: \"kubernetes.io/projected/25a345f3-48cb-42f4-945f-1c373095e97b-kube-api-access-lrfkh\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602485 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-profile-collector-cert\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602501 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d682e57-80e9-495e-b81b-d49e5ffda0f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602515 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d810e619-3771-414f-a7f6-87ab4f186478-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602534 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fncc7\" (UniqueName: \"kubernetes.io/projected/893a583d-ded3-4c17-b16c-b8f0e18ace91-kube-api-access-fncc7\") pod \"downloads-7954f5f757-np26j\" (UID: \"893a583d-ded3-4c17-b16c-b8f0e18ace91\") " pod="openshift-console/downloads-7954f5f757-np26j" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602554 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a82e244-d9ca-4b9f-b309-6a795b1385fd-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602571 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvksv\" (UniqueName: \"kubernetes.io/projected/db426b25-ee7d-4e32-bee8-5ca494b37c06-kube-api-access-bvksv\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602590 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-key\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602607 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255sj\" (UniqueName: \"kubernetes.io/projected/e8c54507-e974-490a-97d8-601f70886942-kube-api-access-255sj\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602619 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j72km"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602651 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fgq8\" (UniqueName: \"kubernetes.io/projected/f8122128-1530-410d-a26b-068922cea39b-kube-api-access-6fgq8\") pod \"control-plane-machine-set-operator-78cbb6b69f-w4dpf\" (UID: 
\"f8122128-1530-410d-a26b-068922cea39b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602742 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8c54507-e974-490a-97d8-601f70886942-serving-cert\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602777 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8bm\" (UniqueName: \"kubernetes.io/projected/d3f29540-afe4-44a5-af16-a86cb8d700da-kube-api-access-nh8bm\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602824 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74633d3d-6855-4249-a997-ee82ea68771b-metrics-tls\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602844 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aba4605a-0c79-443a-a519-6555a215e3aa-serving-cert\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602865 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d0203e72-df97-4a97-8f45-65175f7d9839-config\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602883 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602918 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-audit-policies\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.602959 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603005 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2tn4\" (UniqueName: \"kubernetes.io/projected/2277b5ae-22d7-4c52-8660-40b4c4a382b6-kube-api-access-r2tn4\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603037 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603058 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-serving-cert\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603080 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-tmpfs\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603108 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8122128-1530-410d-a26b-068922cea39b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w4dpf\" (UID: \"f8122128-1530-410d-a26b-068922cea39b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603146 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603211 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-stats-auth\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603235 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-apiservice-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603270 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0203e72-df97-4a97-8f45-65175f7d9839-images\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603294 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0203e72-df97-4a97-8f45-65175f7d9839-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603319 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln8wv\" (UniqueName: \"kubernetes.io/projected/aba4605a-0c79-443a-a519-6555a215e3aa-kube-api-access-ln8wv\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603352 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d682e57-80e9-495e-b81b-d49e5ffda0f7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603378 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.603404 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2277b5ae-22d7-4c52-8660-40b4c4a382b6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 
13:03:13.604933 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.606551 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.607245 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.608309 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-policies\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.608294 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.608373 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-dir\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.608473 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0203e72-df97-4a97-8f45-65175f7d9839-images\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.608484 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0203e72-df97-4a97-8f45-65175f7d9839-config\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.609231 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.609775 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/747a40c2-ac24-4a4d-b444-437ea791dbe1-auth-proxy-config\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.610311 4959 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b48pv"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.610676 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747a40c2-ac24-4a4d-b444-437ea791dbe1-config\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.610989 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.611042 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.611155 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.612002 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/747a40c2-ac24-4a4d-b444-437ea791dbe1-machine-approver-tls\") pod \"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.612053 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.612133 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b99bm"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.612149 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.612177 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c4b9607-9bba-4c4d-ab60-a3e14de17949-metrics-tls\") pod \"dns-operator-744455d44c-ptjbf\" (UID: \"4c4b9607-9bba-4c4d-ab60-a3e14de17949\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.613775 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.613944 
4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0203e72-df97-4a97-8f45-65175f7d9839-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.614218 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.614266 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.614814 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.618222 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hszvw"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.620522 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.620461 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.621272 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.622810 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.624078 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.624504 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9rgh8"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.624744 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.625796 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.627027 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.628262 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.630460 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.630491 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gct22"] Oct 07 13:03:13 
crc kubenswrapper[4959]: I1007 13:03:13.631959 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.633137 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-np26j"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.634538 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wvxrf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.635374 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4wfvg"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.637128 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.639421 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.641443 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.651870 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.653833 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.655131 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c5bnk"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.656092 4959 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.657297 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hmdw2"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.658412 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.659552 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z65mr"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.660881 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.661395 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.661571 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.662614 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dbzd2"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.664682 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.664877 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.664895 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.665668 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dbzd2"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.666661 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hszvw"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.668070 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qg97s"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.668890 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qg97s" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.669266 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qg97s"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.681423 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.684157 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-n8hv8"] Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.684876 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.701091 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.703930 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-images\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.703964 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-srv-cert\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.703982 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-service-ca-bundle\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704003 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea769db2-2681-4d3a-9968-ec18475c4690-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wvxrf\" (UID: \"ea769db2-2681-4d3a-9968-ec18475c4690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" Oct 07 
13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704025 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7136b871-7eea-4199-8712-643539681f53-proxy-tls\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704044 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w564f\" (UniqueName: \"kubernetes.io/projected/7136b871-7eea-4199-8712-643539681f53-kube-api-access-w564f\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704062 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-metrics-certs\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704082 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvqw\" (UniqueName: \"kubernetes.io/projected/d810e619-3771-414f-a7f6-87ab4f186478-kube-api-access-gzvqw\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704111 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25a345f3-48cb-42f4-945f-1c373095e97b-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704133 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-default-certificate\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704153 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78d2cce2-47b1-419c-92f4-3c83d499804c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704171 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-etcd-ca\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704186 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-etcd-client\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704201 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-webhook-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704217 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrfkh\" (UniqueName: \"kubernetes.io/projected/25a345f3-48cb-42f4-945f-1c373095e97b-kube-api-access-lrfkh\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704248 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-profile-collector-cert\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704270 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d682e57-80e9-495e-b81b-d49e5ffda0f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704289 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d810e619-3771-414f-a7f6-87ab4f186478-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704307 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a82e244-d9ca-4b9f-b309-6a795b1385fd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704323 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvksv\" (UniqueName: \"kubernetes.io/projected/db426b25-ee7d-4e32-bee8-5ca494b37c06-kube-api-access-bvksv\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704340 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-key\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704355 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255sj\" (UniqueName: \"kubernetes.io/projected/e8c54507-e974-490a-97d8-601f70886942-kube-api-access-255sj\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704372 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fgq8\" (UniqueName: 
\"kubernetes.io/projected/f8122128-1530-410d-a26b-068922cea39b-kube-api-access-6fgq8\") pod \"control-plane-machine-set-operator-78cbb6b69f-w4dpf\" (UID: \"f8122128-1530-410d-a26b-068922cea39b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704388 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8c54507-e974-490a-97d8-601f70886942-serving-cert\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704403 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8bm\" (UniqueName: \"kubernetes.io/projected/d3f29540-afe4-44a5-af16-a86cb8d700da-kube-api-access-nh8bm\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704419 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74633d3d-6855-4249-a997-ee82ea68771b-metrics-tls\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704434 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aba4605a-0c79-443a-a519-6555a215e3aa-serving-cert\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704452 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-audit-policies\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704467 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704483 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tn4\" (UniqueName: \"kubernetes.io/projected/2277b5ae-22d7-4c52-8660-40b4c4a382b6-kube-api-access-r2tn4\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704498 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-serving-cert\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704513 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-tmpfs\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 
crc kubenswrapper[4959]: I1007 13:03:13.704533 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8122128-1530-410d-a26b-068922cea39b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w4dpf\" (UID: \"f8122128-1530-410d-a26b-068922cea39b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704555 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704573 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-stats-auth\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704591 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-apiservice-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704607 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln8wv\" (UniqueName: \"kubernetes.io/projected/aba4605a-0c79-443a-a519-6555a215e3aa-kube-api-access-ln8wv\") 
pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704647 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d682e57-80e9-495e-b81b-d49e5ffda0f7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704666 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704680 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2277b5ae-22d7-4c52-8660-40b4c4a382b6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704696 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-etcd-service-ca\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704713 
4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704735 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svmll\" (UniqueName: \"kubernetes.io/projected/235fc95a-b9be-4d4b-82af-87213702f88d-kube-api-access-svmll\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704752 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b487cdd-8a08-4621-9259-567d66d5cc06-secret-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704769 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704786 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235fc95a-b9be-4d4b-82af-87213702f88d-config\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: 
\"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704801 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmg9r\" (UniqueName: \"kubernetes.io/projected/473f716e-b637-46dd-a9aa-c6fd19b2c12d-kube-api-access-rmg9r\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704817 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-cabundle\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704836 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds8hv\" (UniqueName: \"kubernetes.io/projected/85df292a-1000-48f0-be15-823ada38a57b-kube-api-access-ds8hv\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704857 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704873 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d810e619-3771-414f-a7f6-87ab4f186478-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704916 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9mk\" (UniqueName: \"kubernetes.io/projected/74633d3d-6855-4249-a997-ee82ea68771b-kube-api-access-zt9mk\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704935 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a82e244-d9ca-4b9f-b309-6a795b1385fd-config\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704952 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-config\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.704968 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt2ch\" (UniqueName: \"kubernetes.io/projected/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-kube-api-access-dt2ch\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 
13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.705858 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-audit-policies\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.706235 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.706818 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-tmpfs\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.706890 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2277b5ae-22d7-4c52-8660-40b4c4a382b6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.706996 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.707380 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3f29540-afe4-44a5-af16-a86cb8d700da-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.707504 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25a345f3-48cb-42f4-945f-1c373095e97b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.707835 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708085 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8c54507-e974-490a-97d8-601f70886942-etcd-client\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708112 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-srv-cert\") pod 
\"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708247 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74633d3d-6855-4249-a997-ee82ea68771b-trusted-ca\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708309 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708350 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d682e57-80e9-495e-b81b-d49e5ffda0f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708375 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rxj\" (UniqueName: \"kubernetes.io/projected/2f920e09-08a8-49c4-b217-c53a126eb3bf-kube-api-access-l7rxj\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708399 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74633d3d-6855-4249-a997-ee82ea68771b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708417 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708440 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-config\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708462 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-encryption-config\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708480 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xt7\" (UniqueName: \"kubernetes.io/projected/879a5223-92dc-4b9e-9749-15fe196c09dd-kube-api-access-24xt7\") pod \"migrator-59844c95c7-8l4sz\" (UID: \"879a5223-92dc-4b9e-9749-15fe196c09dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 
13:03:13.708518 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht67\" (UniqueName: \"kubernetes.io/projected/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-kube-api-access-mht67\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708537 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f29540-afe4-44a5-af16-a86cb8d700da-audit-dir\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708565 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn58g\" (UniqueName: \"kubernetes.io/projected/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-kube-api-access-vn58g\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708583 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78d2cce2-47b1-419c-92f4-3c83d499804c-config\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708601 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f920e09-08a8-49c4-b217-c53a126eb3bf-service-ca-bundle\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") 
" pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708611 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f29540-afe4-44a5-af16-a86cb8d700da-audit-dir\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708619 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d810e619-3771-414f-a7f6-87ab4f186478-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708780 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2277b5ae-22d7-4c52-8660-40b4c4a382b6-serving-cert\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708812 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/5b487cdd-8a08-4621-9259-567d66d5cc06-kube-api-access-lt98r\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708839 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2a82e244-d9ca-4b9f-b309-6a795b1385fd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708841 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74633d3d-6855-4249-a997-ee82ea68771b-metrics-tls\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708864 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw9jm\" (UniqueName: \"kubernetes.io/projected/ea769db2-2681-4d3a-9968-ec18475c4690-kube-api-access-vw9jm\") pod \"multus-admission-controller-857f4d67dd-wvxrf\" (UID: \"ea769db2-2681-4d3a-9968-ec18475c4690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708895 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25a345f3-48cb-42f4-945f-1c373095e97b-proxy-tls\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.708929 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235fc95a-b9be-4d4b-82af-87213702f88d-serving-cert\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:13 crc 
kubenswrapper[4959]: I1007 13:03:13.708946 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78d2cce2-47b1-419c-92f4-3c83d499804c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.709572 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-config\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.709956 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74633d3d-6855-4249-a997-ee82ea68771b-trusted-ca\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.710306 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aba4605a-0c79-443a-a519-6555a215e3aa-serving-cert\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.710489 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-etcd-client\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.710896 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-encryption-config\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.711273 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8122128-1530-410d-a26b-068922cea39b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w4dpf\" (UID: \"f8122128-1530-410d-a26b-068922cea39b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.713002 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2277b5ae-22d7-4c52-8660-40b4c4a382b6-serving-cert\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.713038 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f29540-afe4-44a5-af16-a86cb8d700da-serving-cert\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.718830 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.726240 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.731969 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.741299 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.747431 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba4605a-0c79-443a-a519-6555a215e3aa-service-ca-bundle\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.762066 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.781722 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.801080 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.813196 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8c54507-e974-490a-97d8-601f70886942-serving-cert\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.821456 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.829320 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.840017 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8c54507-e974-490a-97d8-601f70886942-etcd-client\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.844210 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.861383 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.868926 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-etcd-ca\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.881694 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.886154 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-etcd-service-ca\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.899991 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c54507-e974-490a-97d8-601f70886942-config\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.901309 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.922094 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.944858 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.965288 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.969760 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d810e619-3771-414f-a7f6-87ab4f186478-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.977753 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.978513 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b99bm"]
Oct 07 13:03:13 crc kubenswrapper[4959]: W1007 13:03:13.985969 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7ea9371_6a1a_4aac_8361_f9f68dcdc194.slice/crio-ac9b735fa3b0d838022fa18b8e77c68af4611229d9b7570954d81ffe2c7d321b WatchSource:0}: Error finding container ac9b735fa3b0d838022fa18b8e77c68af4611229d9b7570954d81ffe2c7d321b: Status 404 returned error can't find the container with id ac9b735fa3b0d838022fa18b8e77c68af4611229d9b7570954d81ffe2c7d321b
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.997157 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cptrk"]
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.997606 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzc4\" (UniqueName: \"kubernetes.io/projected/58bc42fb-61ba-4342-98c6-45535e156eb6-kube-api-access-lmzc4\") pod \"route-controller-manager-6576b87f9c-r984j\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"
Oct 07 13:03:13 crc kubenswrapper[4959]: I1007 13:03:13.999452 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.000465 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 07 13:03:14 crc kubenswrapper[4959]: W1007 13:03:14.011287 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd888a7_3fd1_4d12_8005_94fdae5be125.slice/crio-1f991f34a4c0bf74977101d05f5dfbdf4957db2fab389c670739c8fbeaf792ce WatchSource:0}: Error finding container 1f991f34a4c0bf74977101d05f5dfbdf4957db2fab389c670739c8fbeaf792ce: Status 404 returned error can't find the container with id 1f991f34a4c0bf74977101d05f5dfbdf4957db2fab389c670739c8fbeaf792ce
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.021381 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.041873 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.053485 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d810e619-3771-414f-a7f6-87ab4f186478-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.062438 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.081877 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.101871 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.145164 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ng9v\" (UniqueName: \"kubernetes.io/projected/5802c077-db54-4212-936e-4aef4e394099-kube-api-access-6ng9v\") pod \"openshift-apiserver-operator-796bbdcf4f-8b86t\" (UID: \"5802c077-db54-4212-936e-4aef4e394099\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.161722 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6srw\" (UniqueName: \"kubernetes.io/projected/cb768738-0e4a-41f4-ba72-2282a201fa5b-kube-api-access-r6srw\") pod \"console-operator-58897d9998-b48pv\" (UID: \"cb768738-0e4a-41f4-ba72-2282a201fa5b\") " pod="openshift-console-operator/console-operator-58897d9998-b48pv"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.161772 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"]
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.162067 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 07 13:03:14 crc kubenswrapper[4959]: W1007 13:03:14.173030 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58bc42fb_61ba_4342_98c6_45535e156eb6.slice/crio-42dbb7bbc09af41386ae22f0fe0451cb7a60da086f44398677247e55a9804a16 WatchSource:0}: Error finding container 42dbb7bbc09af41386ae22f0fe0451cb7a60da086f44398677247e55a9804a16: Status 404 returned error can't find the container with id 42dbb7bbc09af41386ae22f0fe0451cb7a60da086f44398677247e55a9804a16
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.175572 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-default-certificate\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.184136 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.191342 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-metrics-certs\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.200106 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-b48pv"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.202007 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.221063 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.229947 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f920e09-08a8-49c4-b217-c53a126eb3bf-service-ca-bundle\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.234065 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.241927 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.252411 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f920e09-08a8-49c4-b217-c53a126eb3bf-stats-auth\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.278374 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4kx\" (UniqueName: \"kubernetes.io/projected/5c09a1b4-7678-463a-9bba-f97153ade5ad-kube-api-access-kq4kx\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cbbs\" (UID: \"5c09a1b4-7678-463a-9bba-f97153ade5ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.297548 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbl7w\" (UniqueName: \"kubernetes.io/projected/dd723ca8-9cfa-465f-b706-feaa015d9e0d-kube-api-access-lbl7w\") pod \"console-f9d7485db-j72km\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") " pod="openshift-console/console-f9d7485db-j72km"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.302106 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.324108 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.339197 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25a345f3-48cb-42f4-945f-1c373095e97b-proxy-tls\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.343583 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.362937 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.370080 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea769db2-2681-4d3a-9968-ec18475c4690-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wvxrf\" (UID: \"ea769db2-2681-4d3a-9968-ec18475c4690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.381005 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.388605 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b48pv"]
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.402427 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.418533 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t"]
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.422262 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.441742 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.452247 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78d2cce2-47b1-419c-92f4-3c83d499804c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"
Oct 07 13:03:14 crc kubenswrapper[4959]: W1007 13:03:14.455610 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5802c077_db54_4212_936e_4aef4e394099.slice/crio-ac0d673060a847a7c9ae0e4ae5a9b01c044e81384a1078dba971aebabd4656d5 WatchSource:0}: Error finding container ac0d673060a847a7c9ae0e4ae5a9b01c044e81384a1078dba971aebabd4656d5: Status 404 returned error can't find the container with id ac0d673060a847a7c9ae0e4ae5a9b01c044e81384a1078dba971aebabd4656d5
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.460779 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.470085 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78d2cce2-47b1-419c-92f4-3c83d499804c-config\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.502395 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.532008 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.539554 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.541762 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.550481 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-srv-cert\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.551282 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j72km"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.570715 4959 request.go:700] Waited for 1.002369701s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.573246 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.579523 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-profile-collector-cert\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.580506 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b487cdd-8a08-4621-9259-567d66d5cc06-secret-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.581249 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.581784 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.602495 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.622769 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.646233 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.649928 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a82e244-d9ca-4b9f-b309-6a795b1385fd-config\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.662420 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.683787 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a82e244-d9ca-4b9f-b309-6a795b1385fd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.686086 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.703478 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706678 4959 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706679 4959 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706720 4959 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706762 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics podName:85df292a-1000-48f0-be15-823ada38a57b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.206740783 +0000 UTC m=+147.367463460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics") pod "marketplace-operator-79b997595-z65mr" (UID: "85df292a-1000-48f0-be15-823ada38a57b") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706925 4959 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706779 4959 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706904 4959 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706939 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-apiservice-cert podName:6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.206917078 +0000 UTC m=+147.367639775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-apiservice-cert") pod "packageserver-d55dfcdfc-4wmmh" (UID: "6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.707121 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" event={"ID":"58bc42fb-61ba-4342-98c6-45535e156eb6","Type":"ContainerStarted","Data":"4ef1600272ac8a55eaaf8559867dc70e33c4573f1404af4dbc9ec6695af9c696"}
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.706968 4959 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.707163 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" event={"ID":"58bc42fb-61ba-4342-98c6-45535e156eb6","Type":"ContainerStarted","Data":"42dbb7bbc09af41386ae22f0fe0451cb7a60da086f44398677247e55a9804a16"}
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707222 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7136b871-7eea-4199-8712-643539681f53-proxy-tls podName:7136b871-7eea-4199-8712-643539681f53 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.207169357 +0000 UTC m=+147.367892044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7136b871-7eea-4199-8712-643539681f53-proxy-tls") pod "machine-config-operator-74547568cd-6stgg" (UID: "7136b871-7eea-4199-8712-643539681f53") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707267 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-package-server-manager-serving-cert podName:af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.20725691 +0000 UTC m=+147.367979587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-vznnm" (UID: "af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707284 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-images podName:7136b871-7eea-4199-8712-643539681f53 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.2072789 +0000 UTC m=+147.368001577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-images") pod "machine-config-operator-74547568cd-6stgg" (UID: "7136b871-7eea-4199-8712-643539681f53") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707300 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/235fc95a-b9be-4d4b-82af-87213702f88d-config podName:235fc95a-b9be-4d4b-82af-87213702f88d nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.207292451 +0000 UTC m=+147.368015128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/235fc95a-b9be-4d4b-82af-87213702f88d-config") pod "service-ca-operator-777779d784-f6pcc" (UID: "235fc95a-b9be-4d4b-82af-87213702f88d") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707315 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-key podName:db426b25-ee7d-4e32-bee8-5ca494b37c06 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.207308181 +0000 UTC m=+147.368030858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-key") pod "service-ca-9c57cc56f-hmdw2" (UID: "db426b25-ee7d-4e32-bee8-5ca494b37c06") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.707358 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707589 4959 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707668 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-webhook-cert podName:6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.207647063 +0000 UTC m=+147.368369740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-webhook-cert") pod "packageserver-d55dfcdfc-4wmmh" (UID: "6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.707994 4959 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708046 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4d682e57-80e9-495e-b81b-d49e5ffda0f7-config podName:4d682e57-80e9-495e-b81b-d49e5ffda0f7 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.208036456 +0000 UTC m=+147.368759123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/4d682e57-80e9-495e-b81b-d49e5ffda0f7-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" (UID: "4d682e57-80e9-495e-b81b-d49e5ffda0f7") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708105 4959 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708208 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-cabundle podName:db426b25-ee7d-4e32-bee8-5ca494b37c06 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.20818506 +0000 UTC m=+147.368907917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-cabundle") pod "service-ca-9c57cc56f-hmdw2" (UID: "db426b25-ee7d-4e32-bee8-5ca494b37c06") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708263 4959 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708297 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca podName:85df292a-1000-48f0-be15-823ada38a57b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.208287924 +0000 UTC m=+147.369010821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca") pod "marketplace-operator-79b997595-z65mr" (UID: "85df292a-1000-48f0-be15-823ada38a57b") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708325 4959 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708365 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-srv-cert podName:473f716e-b637-46dd-a9aa-c6fd19b2c12d nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.208356666 +0000 UTC m=+147.369079563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-srv-cert") pod "catalog-operator-68c6474976-t7nqt" (UID: "473f716e-b637-46dd-a9aa-c6fd19b2c12d") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708743 4959 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708799 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d682e57-80e9-495e-b81b-d49e5ffda0f7-serving-cert podName:4d682e57-80e9-495e-b81b-d49e5ffda0f7 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.20878587 +0000 UTC m=+147.369508557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4d682e57-80e9-495e-b81b-d49e5ffda0f7-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" (UID: "4d682e57-80e9-495e-b81b-d49e5ffda0f7") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708920 4959 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.708964 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume podName:5b487cdd-8a08-4621-9259-567d66d5cc06 nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.208954036 +0000 UTC m=+147.369676713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume") pod "collect-profiles-29330700-r76vr" (UID: "5b487cdd-8a08-4621-9259-567d66d5cc06") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.709323 4959 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: E1007 13:03:14.709368 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/235fc95a-b9be-4d4b-82af-87213702f88d-serving-cert podName:235fc95a-b9be-4d4b-82af-87213702f88d nodeName:}" failed. No retries permitted until 2025-10-07 13:03:15.209357999 +0000 UTC m=+147.370080676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/235fc95a-b9be-4d4b-82af-87213702f88d-serving-cert") pod "service-ca-operator-777779d784-f6pcc" (UID: "235fc95a-b9be-4d4b-82af-87213702f88d") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.709426 4959 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-r984j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.709466 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 07
13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.710088 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" event={"ID":"b8fb29fc-2a2e-4d11-827c-426683acaef5","Type":"ContainerStarted","Data":"e777ba287a64ff08ebd3752af9d28e4c800e5c2804d2c1ef096ae1bac4bf0a61"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.710122 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" event={"ID":"b8fb29fc-2a2e-4d11-827c-426683acaef5","Type":"ContainerStarted","Data":"8bd681a4afd3ebdad046424fe9712eccde46563a9b09b667304334d4fc326642"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.710132 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" event={"ID":"b8fb29fc-2a2e-4d11-827c-426683acaef5","Type":"ContainerStarted","Data":"2c363f3b6073312b97c7a120f3df676b783084d174ab29882779bcf42dae54c0"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.711413 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-b48pv" event={"ID":"cb768738-0e4a-41f4-ba72-2282a201fa5b","Type":"ContainerStarted","Data":"d98d2fb37bc35e1042030d1131086c808cf6d82a51edf312f0e26cb111ffee5e"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.712582 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" event={"ID":"6fd888a7-3fd1-4d12-8005-94fdae5be125","Type":"ContainerStarted","Data":"120071c18d87cb632242d156e447b8d2373b30c737f42f0f6b24a6f4f1712edd"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.712612 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" 
event={"ID":"6fd888a7-3fd1-4d12-8005-94fdae5be125","Type":"ContainerStarted","Data":"1f991f34a4c0bf74977101d05f5dfbdf4957db2fab389c670739c8fbeaf792ce"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.713743 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.715354 4959 generic.go:334] "Generic (PLEG): container finished" podID="e7ea9371-6a1a-4aac-8361-f9f68dcdc194" containerID="591fba3c4d87d541244f41ffbcbddc0036267e3c652d7cdb58aa242bbd6e954f" exitCode=0 Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.715427 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" event={"ID":"e7ea9371-6a1a-4aac-8361-f9f68dcdc194","Type":"ContainerDied","Data":"591fba3c4d87d541244f41ffbcbddc0036267e3c652d7cdb58aa242bbd6e954f"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.715447 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" event={"ID":"e7ea9371-6a1a-4aac-8361-f9f68dcdc194","Type":"ContainerStarted","Data":"ac9b735fa3b0d838022fa18b8e77c68af4611229d9b7570954d81ffe2c7d321b"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.716846 4959 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cptrk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.716878 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection 
refused" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.721512 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.722563 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" event={"ID":"5802c077-db54-4212-936e-4aef4e394099","Type":"ContainerStarted","Data":"6642e4d391c5b339bb8efb1df7c6bdc9f8bf1c2d42d242174d1f0103d8655df5"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.722653 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" event={"ID":"5802c077-db54-4212-936e-4aef4e394099","Type":"ContainerStarted","Data":"ac0d673060a847a7c9ae0e4ae5a9b01c044e81384a1078dba971aebabd4656d5"} Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.732426 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs"] Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.747014 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.761164 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.767144 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j72km"] Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.781416 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 13:03:14 crc kubenswrapper[4959]: 
I1007 13:03:14.801315 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.821002 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.841820 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.861983 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.882041 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.901558 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.922853 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.942747 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.961060 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 13:03:14 crc kubenswrapper[4959]: I1007 13:03:14.982181 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.002098 4959 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.022910 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.043178 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.061374 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.080938 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.100838 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.128285 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.141162 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.160708 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.181577 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.201879 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 13:03:15 crc 
kubenswrapper[4959]: I1007 13:03:15.222088 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.241831 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.243892 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-apiservice-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.243951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244005 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244050 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235fc95a-b9be-4d4b-82af-87213702f88d-config\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: 
\"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244085 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-cabundle\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244149 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244185 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-srv-cert\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244220 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244294 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d682e57-80e9-495e-b81b-d49e5ffda0f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244433 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235fc95a-b9be-4d4b-82af-87213702f88d-serving-cert\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244480 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-images\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244511 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7136b871-7eea-4199-8712-643539681f53-proxy-tls\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244564 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-webhook-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 
13:03:15.244605 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d682e57-80e9-495e-b81b-d49e5ffda0f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.244670 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-key\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.245285 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-cabundle\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.245520 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.246053 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7136b871-7eea-4199-8712-643539681f53-images\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.249017 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d682e57-80e9-495e-b81b-d49e5ffda0f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.249023 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.251141 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.251451 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db426b25-ee7d-4e32-bee8-5ca494b37c06-signing-key\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.251546 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.253409 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-webhook-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.253653 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/473f716e-b637-46dd-a9aa-c6fd19b2c12d-srv-cert\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.254752 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-apiservice-cert\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.254902 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d682e57-80e9-495e-b81b-d49e5ffda0f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.260841 
4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.264099 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7136b871-7eea-4199-8712-643539681f53-proxy-tls\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.268980 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/235fc95a-b9be-4d4b-82af-87213702f88d-serving-cert\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.281887 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.285242 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235fc95a-b9be-4d4b-82af-87213702f88d-config\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.301343 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.362716 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbk55\" (UniqueName: \"kubernetes.io/projected/747a40c2-ac24-4a4d-b444-437ea791dbe1-kube-api-access-kbk55\") pod 
\"machine-approver-56656f9798-vxfg9\" (UID: \"747a40c2-ac24-4a4d-b444-437ea791dbe1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.383246 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tckch\" (UniqueName: \"kubernetes.io/projected/4c4b9607-9bba-4c4d-ab60-a3e14de17949-kube-api-access-tckch\") pod \"dns-operator-744455d44c-ptjbf\" (UID: \"4c4b9607-9bba-4c4d-ab60-a3e14de17949\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.397127 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x968\" (UniqueName: \"kubernetes.io/projected/5ed6f47e-1445-40fb-a469-690dc49e5974-kube-api-access-6x968\") pod \"oauth-openshift-558db77b4-gct22\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") " pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.417879 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64jhh\" (UniqueName: \"kubernetes.io/projected/d0203e72-df97-4a97-8f45-65175f7d9839-kube-api-access-64jhh\") pod \"machine-api-operator-5694c8668f-c5bnk\" (UID: \"d0203e72-df97-4a97-8f45-65175f7d9839\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.437157 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncc7\" (UniqueName: \"kubernetes.io/projected/893a583d-ded3-4c17-b16c-b8f0e18ace91-kube-api-access-fncc7\") pod \"downloads-7954f5f757-np26j\" (UID: \"893a583d-ded3-4c17-b16c-b8f0e18ace91\") " pod="openshift-console/downloads-7954f5f757-np26j" Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.442119 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 13:03:15 crc 
kubenswrapper[4959]: I1007 13:03:15.459908 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-np26j"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.462651 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.469390 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.478520 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.480652 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.492554 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gct22"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.501110 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.504749 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.521659 4959 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.541571 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.562099 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.580116 4959 request.go:700] Waited for 1.911013992s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.582456 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.602585 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.621250 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.643277 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.662710 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.682570 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.718984 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt2ch\" (UniqueName: \"kubernetes.io/projected/6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e-kube-api-access-dt2ch\") pod \"packageserver-d55dfcdfc-4wmmh\" (UID: \"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.727740 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-b48pv" event={"ID":"cb768738-0e4a-41f4-ba72-2282a201fa5b","Type":"ContainerStarted","Data":"fc97d6c1b509a115eaec0c1c11ef84c0dcb9ed80315b23aca23ad4053522d350"}
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.728456 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-b48pv"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.729360 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" event={"ID":"e7ea9371-6a1a-4aac-8361-f9f68dcdc194","Type":"ContainerStarted","Data":"e8ca3137aa5500e3baeedf4c1d8e532748e9276252c3bb31df9ad51ae24269ce"}
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.729484 4959 patch_prober.go:28] interesting pod/console-operator-58897d9998-b48pv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.729526 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-b48pv" podUID="cb768738-0e4a-41f4-ba72-2282a201fa5b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.730741 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" event={"ID":"5c09a1b4-7678-463a-9bba-f97153ade5ad","Type":"ContainerStarted","Data":"1936ff5fdff98fab1ccfe17ff0f53efc7edea64ae46722c5cd8928d67a32249f"}
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.730769 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" event={"ID":"5c09a1b4-7678-463a-9bba-f97153ade5ad","Type":"ContainerStarted","Data":"f15918c76f2357cbd792f13c65cca5f8099f45919f78f05108408518f30fa459"}
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.732816 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j72km" event={"ID":"dd723ca8-9cfa-465f-b706-feaa015d9e0d","Type":"ContainerStarted","Data":"fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526"}
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.732843 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j72km" event={"ID":"dd723ca8-9cfa-465f-b706-feaa015d9e0d","Type":"ContainerStarted","Data":"69e75aad4d2b43e7600043e5fe2d700df66749853eeaf5ef64ba245d55a469a6"}
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.734010 4959 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cptrk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.734040 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.734085 4959 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-r984j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.734099 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.738247 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a82e244-d9ca-4b9f-b309-6a795b1385fd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dm8j2\" (UID: \"2a82e244-d9ca-4b9f-b309-6a795b1385fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.754850 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln8wv\" (UniqueName: \"kubernetes.io/projected/aba4605a-0c79-443a-a519-6555a215e3aa-kube-api-access-ln8wv\") pod \"authentication-operator-69f744f599-cwbtb\" (UID: \"aba4605a-0c79-443a-a519-6555a215e3aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.773288 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w564f\" (UniqueName: \"kubernetes.io/projected/7136b871-7eea-4199-8712-643539681f53-kube-api-access-w564f\") pod \"machine-config-operator-74547568cd-6stgg\" (UID: \"7136b871-7eea-4199-8712-643539681f53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.796315 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d682e57-80e9-495e-b81b-d49e5ffda0f7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nnqm8\" (UID: \"4d682e57-80e9-495e-b81b-d49e5ffda0f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.817557 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2tn4\" (UniqueName: \"kubernetes.io/projected/2277b5ae-22d7-4c52-8660-40b4c4a382b6-kube-api-access-r2tn4\") pod \"openshift-config-operator-7777fb866f-rz2xc\" (UID: \"2277b5ae-22d7-4c52-8660-40b4c4a382b6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.831581 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"
Oct 07 13:03:15 crc kubenswrapper[4959]: W1007 13:03:15.833433 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod747a40c2_ac24_4a4d_b444_437ea791dbe1.slice/crio-a6d953a9e79601d185a365d97f77fc4cf4f3ba4adb67e54bdf1fae272eb3450a WatchSource:0}: Error finding container a6d953a9e79601d185a365d97f77fc4cf4f3ba4adb67e54bdf1fae272eb3450a: Status 404 returned error can't find the container with id a6d953a9e79601d185a365d97f77fc4cf4f3ba4adb67e54bdf1fae272eb3450a
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.843703 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fgq8\" (UniqueName: \"kubernetes.io/projected/f8122128-1530-410d-a26b-068922cea39b-kube-api-access-6fgq8\") pod \"control-plane-machine-set-operator-78cbb6b69f-w4dpf\" (UID: \"f8122128-1530-410d-a26b-068922cea39b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.857484 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds8hv\" (UniqueName: \"kubernetes.io/projected/85df292a-1000-48f0-be15-823ada38a57b-kube-api-access-ds8hv\") pod \"marketplace-operator-79b997595-z65mr\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z65mr"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.879729 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmll\" (UniqueName: \"kubernetes.io/projected/235fc95a-b9be-4d4b-82af-87213702f88d-kube-api-access-svmll\") pod \"service-ca-operator-777779d784-f6pcc\" (UID: \"235fc95a-b9be-4d4b-82af-87213702f88d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.898543 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255sj\" (UniqueName: \"kubernetes.io/projected/e8c54507-e974-490a-97d8-601f70886942-kube-api-access-255sj\") pod \"etcd-operator-b45778765-9rgh8\" (UID: \"e8c54507-e974-490a-97d8-601f70886942\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.917860 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvksv\" (UniqueName: \"kubernetes.io/projected/db426b25-ee7d-4e32-bee8-5ca494b37c06-kube-api-access-bvksv\") pod \"service-ca-9c57cc56f-hmdw2\" (UID: \"db426b25-ee7d-4e32-bee8-5ca494b37c06\") " pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.936326 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmg9r\" (UniqueName: \"kubernetes.io/projected/473f716e-b637-46dd-a9aa-c6fd19b2c12d-kube-api-access-rmg9r\") pod \"catalog-operator-68c6474976-t7nqt\" (UID: \"473f716e-b637-46dd-a9aa-c6fd19b2c12d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.943744 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.953471 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.961465 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrfkh\" (UniqueName: \"kubernetes.io/projected/25a345f3-48cb-42f4-945f-1c373095e97b-kube-api-access-lrfkh\") pod \"machine-config-controller-84d6567774-47hzk\" (UID: \"25a345f3-48cb-42f4-945f-1c373095e97b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.973633 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.976899 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d810e619-3771-414f-a7f6-87ab4f186478-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.981240 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh"
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.984365 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-np26j"]
Oct 07 13:03:15 crc kubenswrapper[4959]: I1007 13:03:15.995375 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:15.999999 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9mk\" (UniqueName: \"kubernetes.io/projected/74633d3d-6855-4249-a997-ee82ea68771b-kube-api-access-zt9mk\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.003854 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.012861 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.020960 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.030900 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8bm\" (UniqueName: \"kubernetes.io/projected/d3f29540-afe4-44a5-af16-a86cb8d700da-kube-api-access-nh8bm\") pod \"apiserver-7bbb656c7d-j2vw5\" (UID: \"d3f29540-afe4-44a5-af16-a86cb8d700da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.041762 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gct22"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.043653 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvqw\" (UniqueName: \"kubernetes.io/projected/d810e619-3771-414f-a7f6-87ab4f186478-kube-api-access-gzvqw\") pod \"cluster-image-registry-operator-dc59b4c8b-k2c52\" (UID: \"d810e619-3771-414f-a7f6-87ab4f186478\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.063487 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rxj\" (UniqueName: \"kubernetes.io/projected/2f920e09-08a8-49c4-b217-c53a126eb3bf-kube-api-access-l7rxj\") pod \"router-default-5444994796-jn6sf\" (UID: \"2f920e09-08a8-49c4-b217-c53a126eb3bf\") " pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.080410 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74633d3d-6855-4249-a997-ee82ea68771b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n5hgz\" (UID: \"74633d3d-6855-4249-a997-ee82ea68771b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.098924 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xt7\" (UniqueName: \"kubernetes.io/projected/879a5223-92dc-4b9e-9749-15fe196c09dd-kube-api-access-24xt7\") pod \"migrator-59844c95c7-8l4sz\" (UID: \"879a5223-92dc-4b9e-9749-15fe196c09dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.109708 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.117088 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.117506 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn58g\" (UniqueName: \"kubernetes.io/projected/af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15-kube-api-access-vn58g\") pod \"package-server-manager-789f6589d5-vznnm\" (UID: \"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.123144 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.123995 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ptjbf"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.142128 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.144572 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht67\" (UniqueName: \"kubernetes.io/projected/26caac7f-4892-4897-9ae4-0b8aab4c0bc8-kube-api-access-mht67\") pod \"olm-operator-6b444d44fb-sks8d\" (UID: \"26caac7f-4892-4897-9ae4-0b8aab4c0bc8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.149389 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.156738 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.166324 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/5b487cdd-8a08-4621-9259-567d66d5cc06-kube-api-access-lt98r\") pod \"collect-profiles-29330700-r76vr\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.174351 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.177590 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cwbtb"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.177674 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw9jm\" (UniqueName: \"kubernetes.io/projected/ea769db2-2681-4d3a-9968-ec18475c4690-kube-api-access-vw9jm\") pod \"multus-admission-controller-857f4d67dd-wvxrf\" (UID: \"ea769db2-2681-4d3a-9968-ec18475c4690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.181125 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.188071 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.199616 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.201523 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78d2cce2-47b1-419c-92f4-3c83d499804c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jhstz\" (UID: \"78d2cce2-47b1-419c-92f4-3c83d499804c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.205416 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.221324 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.244810 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c5bnk"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.286168 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287186 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6cc7027-7058-494d-bebf-7f32ae551c27-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287217 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6cc7027-7058-494d-bebf-7f32ae551c27-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287283 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-trusted-ca\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287320 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-bound-sa-token\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287343 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2682\" (UniqueName: \"kubernetes.io/projected/f6cc7027-7058-494d-bebf-7f32ae551c27-kube-api-access-t2682\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287367 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-tls\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287391 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-certificates\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287416 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14600350-80fe-4397-8fd8-02c6139cd9d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287440 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5l7\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-kube-api-access-sr5l7\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287476 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14600350-80fe-4397-8fd8-02c6139cd9d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.287534 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.287874 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:16.787860425 +0000 UTC m=+148.948583102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.288341 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.305329 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.382907 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.385985 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh"]
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.388494 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389234 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-csi-data-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389273 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6cc7027-7058-494d-bebf-7f32ae551c27-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389341 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6cc7027-7058-494d-bebf-7f32ae551c27-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389406 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjqqg\" (UniqueName: \"kubernetes.io/projected/6a416b28-ad73-4229-ab18-f467e457330c-kube-api-access-gjqqg\") pod \"ingress-canary-qg97s\" (UID: \"6a416b28-ad73-4229-ab18-f467e457330c\") " pod="openshift-ingress-canary/ingress-canary-qg97s"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389465 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-plugins-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389607 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-trusted-ca\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389653 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-config-volume\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389848 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr982\" (UniqueName: \"kubernetes.io/projected/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-kube-api-access-pr982\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.389948 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-bound-sa-token\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390017 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2682\" (UniqueName: \"kubernetes.io/projected/f6cc7027-7058-494d-bebf-7f32ae551c27-kube-api-access-t2682\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390074 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-tls\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390100 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-metrics-tls\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw"
Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.390262 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:16.890234019 +0000 UTC m=+149.050956696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390354 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlp55\" (UniqueName: \"kubernetes.io/projected/22041b2f-8d5a-4193-98da-808ee1dc9777-kube-api-access-vlp55\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390404 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-certificates\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390519 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a416b28-ad73-4229-ab18-f467e457330c-cert\") pod \"ingress-canary-qg97s\" (UID: \"6a416b28-ad73-4229-ab18-f467e457330c\") " pod="openshift-ingress-canary/ingress-canary-qg97s"
Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390557 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14600350-80fe-4397-8fd8-02c6139cd9d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4wfvg\" (UID:
\"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.390608 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-registration-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.394764 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5l7\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-kube-api-access-sr5l7\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.408337 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e905bc0-b070-4b86-8122-69922d469d8f-certs\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.408866 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e905bc0-b070-4b86-8122-69922d469d8f-node-bootstrap-token\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.408980 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-socket-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.409031 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-mountpoint-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.409124 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szg6q\" (UniqueName: \"kubernetes.io/projected/8e905bc0-b070-4b86-8122-69922d469d8f-kube-api-access-szg6q\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.409208 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14600350-80fe-4397-8fd8-02c6139cd9d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.416143 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-trusted-ca\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.418526 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14600350-80fe-4397-8fd8-02c6139cd9d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.422175 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6cc7027-7058-494d-bebf-7f32ae551c27-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.430716 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6cc7027-7058-494d-bebf-7f32ae551c27-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.447949 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-bound-sa-token\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.454465 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14600350-80fe-4397-8fd8-02c6139cd9d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.461262 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-tls\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.486499 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-certificates\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.487769 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2682\" (UniqueName: \"kubernetes.io/projected/f6cc7027-7058-494d-bebf-7f32ae551c27-kube-api-access-t2682\") pod \"kube-storage-version-migrator-operator-b67b599dd-t74ts\" (UID: \"f6cc7027-7058-494d-bebf-7f32ae551c27\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.488839 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5l7\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-kube-api-access-sr5l7\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530431 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/8e905bc0-b070-4b86-8122-69922d469d8f-certs\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530483 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e905bc0-b070-4b86-8122-69922d469d8f-node-bootstrap-token\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530505 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-socket-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530521 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-mountpoint-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530536 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szg6q\" (UniqueName: \"kubernetes.io/projected/8e905bc0-b070-4b86-8122-69922d469d8f-kube-api-access-szg6q\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530577 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530604 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-csi-data-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530638 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjqqg\" (UniqueName: \"kubernetes.io/projected/6a416b28-ad73-4229-ab18-f467e457330c-kube-api-access-gjqqg\") pod \"ingress-canary-qg97s\" (UID: \"6a416b28-ad73-4229-ab18-f467e457330c\") " pod="openshift-ingress-canary/ingress-canary-qg97s" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530654 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-plugins-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530690 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-config-volume\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530718 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pr982\" (UniqueName: \"kubernetes.io/projected/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-kube-api-access-pr982\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530736 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-metrics-tls\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530754 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlp55\" (UniqueName: \"kubernetes.io/projected/22041b2f-8d5a-4193-98da-808ee1dc9777-kube-api-access-vlp55\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530774 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a416b28-ad73-4229-ab18-f467e457330c-cert\") pod \"ingress-canary-qg97s\" (UID: \"6a416b28-ad73-4229-ab18-f467e457330c\") " pod="openshift-ingress-canary/ingress-canary-qg97s" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.530791 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-registration-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.531028 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-registration-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.531682 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-plugins-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.532404 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-config-volume\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.534238 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-socket-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.534314 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-mountpoint-dir\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.534711 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/22041b2f-8d5a-4193-98da-808ee1dc9777-csi-data-dir\") pod \"csi-hostpathplugin-dbzd2\" 
(UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.534974 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.034938308 +0000 UTC m=+149.195660975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.537464 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e905bc0-b070-4b86-8122-69922d469d8f-node-bootstrap-token\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.540681 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-metrics-tls\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.548551 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e905bc0-b070-4b86-8122-69922d469d8f-certs\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " 
pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.557492 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a416b28-ad73-4229-ab18-f467e457330c-cert\") pod \"ingress-canary-qg97s\" (UID: \"6a416b28-ad73-4229-ab18-f467e457330c\") " pod="openshift-ingress-canary/ingress-canary-qg97s" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.558237 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.584532 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjqqg\" (UniqueName: \"kubernetes.io/projected/6a416b28-ad73-4229-ab18-f467e457330c-kube-api-access-gjqqg\") pod \"ingress-canary-qg97s\" (UID: \"6a416b28-ad73-4229-ab18-f467e457330c\") " pod="openshift-ingress-canary/ingress-canary-qg97s" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.599541 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr982\" (UniqueName: \"kubernetes.io/projected/becaf5d5-ad74-4d9e-9b12-fc9399c41fa5-kube-api-access-pr982\") pod \"dns-default-hszvw\" (UID: \"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5\") " pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.617095 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlp55\" (UniqueName: \"kubernetes.io/projected/22041b2f-8d5a-4193-98da-808ee1dc9777-kube-api-access-vlp55\") pod \"csi-hostpathplugin-dbzd2\" (UID: \"22041b2f-8d5a-4193-98da-808ee1dc9777\") " pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.629256 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.634542 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.635319 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.135295796 +0000 UTC m=+149.296018483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.637581 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szg6q\" (UniqueName: \"kubernetes.io/projected/8e905bc0-b070-4b86-8122-69922d469d8f-kube-api-access-szg6q\") pod \"machine-config-server-n8hv8\" (UID: \"8e905bc0-b070-4b86-8122-69922d469d8f\") " pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.644971 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.660007 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n8hv8" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.662121 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg"] Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.688760 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qg97s" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.736526 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.737386 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.237374671 +0000 UTC m=+149.398097348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.747932 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" event={"ID":"5ed6f47e-1445-40fb-a469-690dc49e5974","Type":"ContainerStarted","Data":"ddbbbbb319785389f2fa4b757f8793f8fa45bc6ea189eb54295b9917597c4639"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.749910 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jn6sf" event={"ID":"2f920e09-08a8-49c4-b217-c53a126eb3bf","Type":"ContainerStarted","Data":"8c5bb37c9a0dae2a3aa2ffbfbe01a08138566ea0aa480767b9b283c1f3c7958b"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.758022 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" event={"ID":"d0203e72-df97-4a97-8f45-65175f7d9839","Type":"ContainerStarted","Data":"1765193c83add8125224c45263b3d1e7872b2a2086f1aad16845530987bd72cf"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.763070 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"] Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.766666 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" event={"ID":"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e","Type":"ContainerStarted","Data":"be9a70e6e5f90dd92432d5b472022ded1ea630e4f4d8d25438e53a7d39ed63f4"} Oct 07 
13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.770182 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" event={"ID":"e7ea9371-6a1a-4aac-8361-f9f68dcdc194","Type":"ContainerStarted","Data":"0e5638640058a45d6f3ecd4c5433e05b86f622ac7d538201a58e7c4b5326f4e8"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.771529 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" event={"ID":"2a82e244-d9ca-4b9f-b309-6a795b1385fd","Type":"ContainerStarted","Data":"27d6dd44b12c807c11a0eb86c4c88743166d568147e7c09c9a616d43688b45f5"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.774101 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" event={"ID":"4c4b9607-9bba-4c4d-ab60-a3e14de17949","Type":"ContainerStarted","Data":"ce1bb631a33f955f301cb943b63861528255737d27740a4c1ae89fdfd91bfce1"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.777657 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" event={"ID":"4d682e57-80e9-495e-b81b-d49e5ffda0f7","Type":"ContainerStarted","Data":"543d6c3f1073504202d5f104031f2b571e351a6f9dfbfb937bed06629ab1e3b6"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.783978 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" event={"ID":"aba4605a-0c79-443a-a519-6555a215e3aa","Type":"ContainerStarted","Data":"0146e60c81d865a001b64dc3e7c1980743f3e1daf8e8fae5c62730ddd45884f8"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.785480 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc"] Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.786728 4959 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-np26j" event={"ID":"893a583d-ded3-4c17-b16c-b8f0e18ace91","Type":"ContainerStarted","Data":"16f151aad4b16dac5d4813d9f06ac02f0122ef2e7c900769b0212228d0864464"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.790467 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" event={"ID":"747a40c2-ac24-4a4d-b444-437ea791dbe1","Type":"ContainerStarted","Data":"b6ba4813ed827c56b14d078f7380838e0f9117c93090aaa3f7d6ee5563b09616"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.790583 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" event={"ID":"747a40c2-ac24-4a4d-b444-437ea791dbe1","Type":"ContainerStarted","Data":"a6d953a9e79601d185a365d97f77fc4cf4f3ba4adb67e54bdf1fae272eb3450a"} Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.791272 4959 patch_prober.go:28] interesting pod/console-operator-58897d9998-b48pv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.791324 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-b48pv" podUID="cb768738-0e4a-41f4-ba72-2282a201fa5b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.794247 4959 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cptrk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 
10.217.0.7:8443: connect: connection refused" start-of-body= Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.794346 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 07 13:03:16 crc kubenswrapper[4959]: W1007 13:03:16.819833 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7136b871_7eea_4199_8712_643539681f53.slice/crio-d402ac9e6b518aab8ecf377097af24651af4bc99cafc781a1da3c9a28e7ba0fc WatchSource:0}: Error finding container d402ac9e6b518aab8ecf377097af24651af4bc99cafc781a1da3c9a28e7ba0fc: Status 404 returned error can't find the container with id d402ac9e6b518aab8ecf377097af24651af4bc99cafc781a1da3c9a28e7ba0fc Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.837380 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.337357527 +0000 UTC m=+149.498080204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.837727 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.838500 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.843590 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.343576583 +0000 UTC m=+149.504299260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.852784 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z65mr"] Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.859218 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hmdw2"] Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.941056 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.941255 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.441224151 +0000 UTC m=+149.601946828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.944277 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:16 crc kubenswrapper[4959]: E1007 13:03:16.944708 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.444688216 +0000 UTC m=+149.605410893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:16 crc kubenswrapper[4959]: I1007 13:03:16.972965 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-b48pv" podStartSLOduration=127.972948613 podStartE2EDuration="2m7.972948613s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:16.965127314 +0000 UTC m=+149.125850001" watchObservedRunningTime="2025-10-07 13:03:16.972948613 +0000 UTC m=+149.133671290" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.050007 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.052064 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.551977754 +0000 UTC m=+149.712700431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.054149 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.055092 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.555080017 +0000 UTC m=+149.715802694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.074428 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9rgh8"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.078941 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.081295 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.087128 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.103110 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.159445 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.159977 4959 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.659954575 +0000 UTC m=+149.820677252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.261835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.262337 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.76231698 +0000 UTC m=+149.923039657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.270784 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8qcxk" podStartSLOduration=128.27076628 podStartE2EDuration="2m8.27076628s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:17.238424537 +0000 UTC m=+149.399147214" watchObservedRunningTime="2025-10-07 13:03:17.27076628 +0000 UTC m=+149.431488957" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.362890 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.363035 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.863003659 +0000 UTC m=+150.023726336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.363209 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.363513 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.863494455 +0000 UTC m=+150.024217132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.464824 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.465448 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:17.965426315 +0000 UTC m=+150.126149002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: W1007 13:03:17.548277 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2277b5ae_22d7_4c52_8660_40b4c4a382b6.slice/crio-3369a8c3e999fa63180c92fbd6b7e9400c1a5ec52f14c0bd844f1ac87b0bbc33 WatchSource:0}: Error finding container 3369a8c3e999fa63180c92fbd6b7e9400c1a5ec52f14c0bd844f1ac87b0bbc33: Status 404 returned error can't find the container with id 3369a8c3e999fa63180c92fbd6b7e9400c1a5ec52f14c0bd844f1ac87b0bbc33 Oct 07 13:03:17 crc kubenswrapper[4959]: W1007 13:03:17.561076 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d2cce2_47b1_419c_92f4_3c83d499804c.slice/crio-ea0c4b5e1cb0e2280a0549ead89713285fce2950edde7fb92bfb7f2a15ebed75 WatchSource:0}: Error finding container ea0c4b5e1cb0e2280a0549ead89713285fce2950edde7fb92bfb7f2a15ebed75: Status 404 returned error can't find the container with id ea0c4b5e1cb0e2280a0549ead89713285fce2950edde7fb92bfb7f2a15ebed75 Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.571892 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:17 crc 
kubenswrapper[4959]: E1007 13:03:17.572344 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.07232533 +0000 UTC m=+150.233048017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.587173 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" podStartSLOduration=128.587145761 podStartE2EDuration="2m8.587145761s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:17.57082414 +0000 UTC m=+149.731546827" watchObservedRunningTime="2025-10-07 13:03:17.587145761 +0000 UTC m=+149.747868438" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.595440 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.599648 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.643573 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d"] Oct 07 13:03:17 crc 
kubenswrapper[4959]: I1007 13:03:17.646900 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dbzd2"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.655552 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hszvw"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.691129 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.691665 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.191641477 +0000 UTC m=+150.352364154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.709395 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-j72km" podStartSLOduration=128.709371415 podStartE2EDuration="2m8.709371415s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:17.685365969 +0000 UTC m=+149.846088656" watchObservedRunningTime="2025-10-07 13:03:17.709371415 +0000 UTC m=+149.870094092" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.718786 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.722532 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.724767 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.727823 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.750497 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"] Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.765861 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wvxrf"] Oct 07 13:03:17 crc kubenswrapper[4959]: W1007 13:03:17.784273 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1dcec3_3b26_4c6c_ac3f_26acfb5ffb15.slice/crio-c2667754270a81c24fdb0ae9fe9f5509586e71ef0fe5e6584601f247e0adadf8 WatchSource:0}: Error finding container c2667754270a81c24fdb0ae9fe9f5509586e71ef0fe5e6584601f247e0adadf8: Status 404 returned error can't find the container with id c2667754270a81c24fdb0ae9fe9f5509586e71ef0fe5e6584601f247e0adadf8 Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.795752 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.795808 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.801739 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.802246 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.302228354 +0000 UTC m=+150.462951031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.803705 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.815268 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" podStartSLOduration=127.815251116 podStartE2EDuration="2m7.815251116s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:17.800942982 +0000 UTC m=+149.961665679" watchObservedRunningTime="2025-10-07 13:03:17.815251116 +0000 UTC m=+149.975973793" Oct 07 
13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.817217 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" event={"ID":"26caac7f-4892-4897-9ae4-0b8aab4c0bc8","Type":"ContainerStarted","Data":"4024dc994e7d57af7c0625057021ce0ed9cd2a84b9c8f2f37d612ce4bd55d062"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.823274 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" event={"ID":"85df292a-1000-48f0-be15-823ada38a57b","Type":"ContainerStarted","Data":"66040e68dddc6727095941060674080531c4f1b75456c420e9a99776e95582e4"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.832158 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8b86t" podStartSLOduration=128.832134826 podStartE2EDuration="2m8.832134826s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:17.818227065 +0000 UTC m=+149.978949742" watchObservedRunningTime="2025-10-07 13:03:17.832134826 +0000 UTC m=+149.992857503" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.849760 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.850263 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" 
event={"ID":"879a5223-92dc-4b9e-9749-15fe196c09dd","Type":"ContainerStarted","Data":"fd8070e01c92a3dc032f078907126bd6c452f8bdb719eeeb2905c99a9e697b91"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.864572 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" event={"ID":"78d2cce2-47b1-419c-92f4-3c83d499804c","Type":"ContainerStarted","Data":"ea0c4b5e1cb0e2280a0549ead89713285fce2950edde7fb92bfb7f2a15ebed75"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.876286 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cbbs" podStartSLOduration=128.876222648 podStartE2EDuration="2m8.876222648s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:17.873050013 +0000 UTC m=+150.033772680" watchObservedRunningTime="2025-10-07 13:03:17.876222648 +0000 UTC m=+150.036945325" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.893483 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" event={"ID":"d0203e72-df97-4a97-8f45-65175f7d9839","Type":"ContainerStarted","Data":"8576f08dd75aa70f81e7d225c08594cbdf6d91a26e4bcdb4ea0ee20fb18cb80d"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.897891 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n8hv8" event={"ID":"8e905bc0-b070-4b86-8122-69922d469d8f","Type":"ContainerStarted","Data":"c0a090baff63bb4de921aafa821fc70b9f41be0632a5d0bcc54fc36170dfb682"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.899155 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" 
event={"ID":"db426b25-ee7d-4e32-bee8-5ca494b37c06","Type":"ContainerStarted","Data":"bbc5caa034514ad83f0cd0a2ded8d39cc55314bfda618753d10c658a4146927c"} Oct 07 13:03:17 crc kubenswrapper[4959]: W1007 13:03:17.899689 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f29540_afe4_44a5_af16_a86cb8d700da.slice/crio-6bb6176bdb0e11b5fd531ead8180be512ffdc9259df196ee231e7c542f388be5 WatchSource:0}: Error finding container 6bb6176bdb0e11b5fd531ead8180be512ffdc9259df196ee231e7c542f388be5: Status 404 returned error can't find the container with id 6bb6176bdb0e11b5fd531ead8180be512ffdc9259df196ee231e7c542f388be5 Oct 07 13:03:17 crc kubenswrapper[4959]: W1007 13:03:17.902830 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74633d3d_6855_4249_a997_ee82ea68771b.slice/crio-8aa91f3d5b15b6bf566163dc79c4158496321624ea8a7d61074bdb6488399bc1 WatchSource:0}: Error finding container 8aa91f3d5b15b6bf566163dc79c4158496321624ea8a7d61074bdb6488399bc1: Status 404 returned error can't find the container with id 8aa91f3d5b15b6bf566163dc79c4158496321624ea8a7d61074bdb6488399bc1 Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.903100 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.903181 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qg97s"] Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.903491 4959 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.403475392 +0000 UTC m=+150.564198059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.911106 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.911304 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.911385 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:17 crc kubenswrapper[4959]: E1007 13:03:17.911777 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.411761756 +0000 UTC m=+150.572484443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.912154 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" event={"ID":"f8122128-1530-410d-a26b-068922cea39b","Type":"ContainerStarted","Data":"7738ff4b60673ad285f741ebb2e9e0b697823785d13273b7ed0fdc775a9d9851"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.917701 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.918043 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.920577 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" event={"ID":"4d682e57-80e9-495e-b81b-d49e5ffda0f7","Type":"ContainerStarted","Data":"59bba66589b89d6e126e5120bef23ca34b86d13505cf3b069ca00d91455ffe9f"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.933567 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" event={"ID":"f6cc7027-7058-494d-bebf-7f32ae551c27","Type":"ContainerStarted","Data":"a8a7972a6b72a2152fde0b16354dfa8d94050559153311c6ec346a796811eee7"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.940027 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" event={"ID":"e8c54507-e974-490a-97d8-601f70886942","Type":"ContainerStarted","Data":"b104ad6ddaa949f7896b80a7f0db555dc647258cbe0be20fe480b5b19fe269d5"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.970993 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" event={"ID":"4c4b9607-9bba-4c4d-ab60-a3e14de17949","Type":"ContainerStarted","Data":"1eb434fc942dc0af47dcdab6dd838120470c0ee25279afb493f03abee0e130d7"} Oct 07 13:03:17 crc kubenswrapper[4959]: W1007 13:03:17.972959 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a416b28_ad73_4229_ab18_f467e457330c.slice/crio-efd804c54532b7df6c8872d93a6a241668decdecc41741688e300487a9a06a26 WatchSource:0}: Error finding container efd804c54532b7df6c8872d93a6a241668decdecc41741688e300487a9a06a26: Status 404 returned error can't find the container with id 
efd804c54532b7df6c8872d93a6a241668decdecc41741688e300487a9a06a26 Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.978752 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-np26j" event={"ID":"893a583d-ded3-4c17-b16c-b8f0e18ace91","Type":"ContainerStarted","Data":"43f8153ee99eefd6547736ffd0da239086e31c16a3158efaff12d74a98e20530"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.979174 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-np26j" Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.979819 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" event={"ID":"235fc95a-b9be-4d4b-82af-87213702f88d","Type":"ContainerStarted","Data":"244be1183efecc5c83850a4e8f7ab3a50df6f6faeba8b7d8f83e492da7b00d31"} Oct 07 13:03:17 crc kubenswrapper[4959]: I1007 13:03:17.988713 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" event={"ID":"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15","Type":"ContainerStarted","Data":"c2667754270a81c24fdb0ae9fe9f5509586e71ef0fe5e6584601f247e0adadf8"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.001079 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.001155 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-np26j" podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 13:03:18 crc 
kubenswrapper[4959]: I1007 13:03:18.011575 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" event={"ID":"d810e619-3771-414f-a7f6-87ab4f186478","Type":"ContainerStarted","Data":"4d0c45a7c306dfadbaa7b0d1200112984041ae434d3f8e2fc1bf44dbdfa8ba90"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.011902 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.012441 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.512417594 +0000 UTC m=+150.673140331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.012734 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.013129 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.513119788 +0000 UTC m=+150.673842465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.015932 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" event={"ID":"6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e","Type":"ContainerStarted","Data":"ac1a184c28cf30063a4797867b26306a472218993c9d6f66f5548e6e92b5259e"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.017971 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.020914 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" event={"ID":"25a345f3-48cb-42f4-945f-1c373095e97b","Type":"ContainerStarted","Data":"7056ad6678e00a040ecc51b7c0a0f0e33dd0537642ed1eed43982d360a2598c5"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.027214 4959 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4wmmh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.027262 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" podUID="6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.028525 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" event={"ID":"ea769db2-2681-4d3a-9968-ec18475c4690","Type":"ContainerStarted","Data":"bfd35b5c0fc7d8cb74e8e9348ce04211f05aca4ad08f85df991339ae3eff4e35"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.035794 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" event={"ID":"5ed6f47e-1445-40fb-a469-690dc49e5974","Type":"ContainerStarted","Data":"fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.036200 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.045454 4959 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gct22 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.045651 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.046690 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" 
event={"ID":"2277b5ae-22d7-4c52-8660-40b4c4a382b6","Type":"ContainerStarted","Data":"3369a8c3e999fa63180c92fbd6b7e9400c1a5ec52f14c0bd844f1ac87b0bbc33"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.049700 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" event={"ID":"7136b871-7eea-4199-8712-643539681f53","Type":"ContainerStarted","Data":"d402ac9e6b518aab8ecf377097af24651af4bc99cafc781a1da3c9a28e7ba0fc"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.052895 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hszvw" event={"ID":"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5","Type":"ContainerStarted","Data":"0eb19e80b20288ebf6a58ae34b24dab7e145fcdf80b03a87e97f52822330b66a"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.055688 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" event={"ID":"2a82e244-d9ca-4b9f-b309-6a795b1385fd","Type":"ContainerStarted","Data":"b492cd510c3c630284ef5bb23f60082ebbbc86f47a85ae734e991c32da6d9a98"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.058199 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" event={"ID":"aba4605a-0c79-443a-a519-6555a215e3aa","Type":"ContainerStarted","Data":"1dc9a6eea801f86e2c38961a02ae26c0b7cbcc85374ba3b48a59ea07e2450fd5"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.060850 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" event={"ID":"473f716e-b637-46dd-a9aa-c6fd19b2c12d","Type":"ContainerStarted","Data":"fe9a4a0fbc46fb58c29177a169bd32add2a5cb0353861b42599036d01b9b420c"} Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.061780 4959 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-b48pv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.061817 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-b48pv" podUID="cb768738-0e4a-41f4-ba72-2282a201fa5b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.114620 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.114778 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.614744948 +0000 UTC m=+150.775467625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.115141 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.117923 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.617906573 +0000 UTC m=+150.778629360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.131326 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.131368 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.131719 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" podStartSLOduration=129.13170694 podStartE2EDuration="2m9.13170694s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.118803022 +0000 UTC m=+150.279525729" watchObservedRunningTime="2025-10-07 13:03:18.13170694 +0000 UTC m=+150.292429607" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.137265 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.217473 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.217678 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.71765121 +0000 UTC m=+150.878373887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.217796 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.218102 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.718088925 +0000 UTC m=+150.878811592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.320823 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.320968 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.820944066 +0000 UTC m=+150.981666743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.321118 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.321374 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.8213665 +0000 UTC m=+150.982089177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.364412 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" podStartSLOduration=129.364395787 podStartE2EDuration="2m9.364395787s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.323579913 +0000 UTC m=+150.484302600" watchObservedRunningTime="2025-10-07 13:03:18.364395787 +0000 UTC m=+150.525118464" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.365743 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" podStartSLOduration=128.365736611 podStartE2EDuration="2m8.365736611s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.361712438 +0000 UTC m=+150.522435135" watchObservedRunningTime="2025-10-07 13:03:18.365736611 +0000 UTC m=+150.526459288" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.422549 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.422801 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.922770382 +0000 UTC m=+151.083493059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.423422 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.423930 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:18.92391601 +0000 UTC m=+151.084638687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.455562 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-np26j" podStartSLOduration=129.455539159 podStartE2EDuration="2m9.455539159s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.454086401 +0000 UTC m=+150.614809078" watchObservedRunningTime="2025-10-07 13:03:18.455539159 +0000 UTC m=+150.616261836" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.490296 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nnqm8" podStartSLOduration=129.490272491 podStartE2EDuration="2m9.490272491s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.487896972 +0000 UTC m=+150.648619649" watchObservedRunningTime="2025-10-07 13:03:18.490272491 +0000 UTC m=+150.650995168" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.507767 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dm8j2" podStartSLOduration=129.507751071 podStartE2EDuration="2m9.507751071s" podCreationTimestamp="2025-10-07 13:01:09 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.506687985 +0000 UTC m=+150.667410672" watchObservedRunningTime="2025-10-07 13:03:18.507751071 +0000 UTC m=+150.668473748" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.524803 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.525154 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.025119616 +0000 UTC m=+151.185842293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.625792 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.626164 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.626506 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.626933 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.126912711 +0000 UTC m=+151.287635388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.627358 4959 patch_prober.go:28] interesting pod/apiserver-76f77b778f-b99bm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.627399 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" podUID="e7ea9371-6a1a-4aac-8361-f9f68dcdc194" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.729501 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.729964 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.229943348 +0000 UTC m=+151.390666025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.832611 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.844784 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.344756775 +0000 UTC m=+151.505479452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.899539 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwbtb" podStartSLOduration=129.899488781 podStartE2EDuration="2m9.899488781s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:18.561905406 +0000 UTC m=+150.722628083" watchObservedRunningTime="2025-10-07 13:03:18.899488781 +0000 UTC m=+151.060211458" Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.946444 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.946685 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.446661015 +0000 UTC m=+151.607383692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:18 crc kubenswrapper[4959]: I1007 13:03:18.947083 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:18 crc kubenswrapper[4959]: E1007 13:03:18.947411 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.447400039 +0000 UTC m=+151.608122716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.067974 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.068767 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.568742773 +0000 UTC m=+151.729465450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.076981 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" event={"ID":"2277b5ae-22d7-4c52-8660-40b4c4a382b6","Type":"ContainerStarted","Data":"22aa77366f82180984de7895cdf471db9878ee9586f7c93025ae863c765a6369"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.085125 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" event={"ID":"f8122128-1530-410d-a26b-068922cea39b","Type":"ContainerStarted","Data":"520f1fbebe3b0ab21ad3f8970dcdafe993da1084d68d090c67d183684087c058"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.118258 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" event={"ID":"85df292a-1000-48f0-be15-823ada38a57b","Type":"ContainerStarted","Data":"3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.119587 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.128560 4959 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z65mr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: 
connection refused" start-of-body= Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.128607 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" podUID="85df292a-1000-48f0-be15-823ada38a57b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.156597 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" event={"ID":"d0203e72-df97-4a97-8f45-65175f7d9839","Type":"ContainerStarted","Data":"80cc680bb9d08921d60d2145c6923ff8d8194f6036854320fa97dd7a0a0ca51a"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.162887 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" event={"ID":"5b487cdd-8a08-4621-9259-567d66d5cc06","Type":"ContainerStarted","Data":"5c0dcbf579fcee7ca6ce5802df71c293018c7dfd113891f3259fdc3055f02d12"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.172086 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.174238 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.674220141 +0000 UTC m=+151.834942818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.188316 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" event={"ID":"d3f29540-afe4-44a5-af16-a86cb8d700da","Type":"ContainerStarted","Data":"6bb6176bdb0e11b5fd531ead8180be512ffdc9259df196ee231e7c542f388be5"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.207013 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jn6sf" event={"ID":"2f920e09-08a8-49c4-b217-c53a126eb3bf","Type":"ContainerStarted","Data":"e0c3bd6c5dd382618eb1530a8d0da9788ae8eab6d3fcd740a99b1bc30825b781"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.210685 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" podStartSLOduration=130.210652429 podStartE2EDuration="2m10.210652429s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.20886148 +0000 UTC m=+151.369584157" watchObservedRunningTime="2025-10-07 13:03:19.210652429 +0000 UTC m=+151.371375106" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.233540 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n8hv8" 
event={"ID":"8e905bc0-b070-4b86-8122-69922d469d8f","Type":"ContainerStarted","Data":"b2e743c162b790fb0f602be70496c9b9702c87bbab34ffdc488130afc9b2e5e4"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.245995 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" event={"ID":"74633d3d-6855-4249-a997-ee82ea68771b","Type":"ContainerStarted","Data":"8aa91f3d5b15b6bf566163dc79c4158496321624ea8a7d61074bdb6488399bc1"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.266332 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" event={"ID":"747a40c2-ac24-4a4d-b444-437ea791dbe1","Type":"ContainerStarted","Data":"1ebbd5e0a10d164310b0fe534783b90f53ac2787aee502203ed4975447dc7140"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.273386 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.274820 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.774804777 +0000 UTC m=+151.935527454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.282848 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-c5bnk" podStartSLOduration=130.282823053 podStartE2EDuration="2m10.282823053s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.272321775 +0000 UTC m=+151.433044452" watchObservedRunningTime="2025-10-07 13:03:19.282823053 +0000 UTC m=+151.443545730" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.356473 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w4dpf" podStartSLOduration=130.356456315 podStartE2EDuration="2m10.356456315s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.356046351 +0000 UTC m=+151.516769028" watchObservedRunningTime="2025-10-07 13:03:19.356456315 +0000 UTC m=+151.517178992" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.378213 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: 
\"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.379455 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.879434567 +0000 UTC m=+152.040157334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.387108 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" event={"ID":"22041b2f-8d5a-4193-98da-808ee1dc9777","Type":"ContainerStarted","Data":"74a18517a12488e0024e0ad8df7ec82a19c3c924340f585b71312f1ccbf6a7c6"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.415722 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" event={"ID":"7136b871-7eea-4199-8712-643539681f53","Type":"ContainerStarted","Data":"2cbd00fea28d3571aa02d66433bdb30a57984c7859c6ea03ea355d398144d8a8"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.435565 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n8hv8" podStartSLOduration=6.435547367 podStartE2EDuration="6.435547367s" podCreationTimestamp="2025-10-07 13:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.40125594 +0000 UTC m=+151.561978617" watchObservedRunningTime="2025-10-07 13:03:19.435547367 +0000 UTC m=+151.596270034" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.437321 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vxfg9" podStartSLOduration=130.437312086 podStartE2EDuration="2m10.437312086s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.432442464 +0000 UTC m=+151.593165141" watchObservedRunningTime="2025-10-07 13:03:19.437312086 +0000 UTC m=+151.598034763" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.442028 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qg97s" event={"ID":"6a416b28-ad73-4229-ab18-f467e457330c","Type":"ContainerStarted","Data":"efd804c54532b7df6c8872d93a6a241668decdecc41741688e300487a9a06a26"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.468471 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jn6sf" podStartSLOduration=130.468450628 podStartE2EDuration="2m10.468450628s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.458515349 +0000 UTC m=+151.619238056" watchObservedRunningTime="2025-10-07 13:03:19.468450628 +0000 UTC m=+151.629173305" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.475681 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" 
event={"ID":"db426b25-ee7d-4e32-bee8-5ca494b37c06","Type":"ContainerStarted","Data":"146b82719d036296decb01dffed8a5f1661d71ae322778237dad767ae5fb1cdc"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.479022 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.479332 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:19.979318809 +0000 UTC m=+152.140041476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.493905 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" event={"ID":"d810e619-3771-414f-a7f6-87ab4f186478","Type":"ContainerStarted","Data":"ad90ab9b278cc7de1248b1f2de689f0cf80f13e077699970191a1d1e9b6dffca"} Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.494920 4959 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4wmmh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.494968 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" podUID="6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.495137 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.495193 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-np26j" podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.495744 4959 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gct22 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.495805 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 
13:03:19.539676 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hmdw2" podStartSLOduration=129.539598998 podStartE2EDuration="2m9.539598998s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.537115336 +0000 UTC m=+151.697838013" watchObservedRunningTime="2025-10-07 13:03:19.539598998 +0000 UTC m=+151.700321675" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.588291 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k2c52" podStartSLOduration=130.588267232 podStartE2EDuration="2m10.588267232s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:19.58821587 +0000 UTC m=+151.748938547" watchObservedRunningTime="2025-10-07 13:03:19.588267232 +0000 UTC m=+151.748989909" Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.601397 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.604518 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.10449644 +0000 UTC m=+152.265219117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.702857 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.707063 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.20702418 +0000 UTC m=+152.367746867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.712807 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.717985 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.217930652 +0000 UTC m=+152.378653329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.813832 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.814334 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.314311078 +0000 UTC m=+152.475033745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:19 crc kubenswrapper[4959]: I1007 13:03:19.916809 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:19 crc kubenswrapper[4959]: E1007 13:03:19.917548 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.417528821 +0000 UTC m=+152.578251498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.018611 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.018834 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.518773498 +0000 UTC m=+152.679496175 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.018936 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.019260 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.519247514 +0000 UTC m=+152.679970181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.121305 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.121705 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.62166524 +0000 UTC m=+152.782387917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.121866 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.122251 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.622238179 +0000 UTC m=+152.782960856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.184051 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.186901 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.186968 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.223884 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.224171 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.724121288 +0000 UTC m=+152.884843975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.224320 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.224878 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.724861053 +0000 UTC m=+152.885583730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.325592 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.325872 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.825844261 +0000 UTC m=+152.986566938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.326236 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.326963 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.826949328 +0000 UTC m=+152.987672005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.427676 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.427950 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.927904176 +0000 UTC m=+153.088626853 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.428083 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.428489 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:20.928456014 +0000 UTC m=+153.089178691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.500506 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a4b362aa72da7ef576c5aea0ecdd8525139f1d00d017df23a4a482a9af6a4005"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.500558 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1a977302a712acc9dff355ac5c9d33877ade4299b06d834921c556fb58f8baf4"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.502484 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" event={"ID":"78d2cce2-47b1-419c-92f4-3c83d499804c","Type":"ContainerStarted","Data":"f9758c27fa92b8fc667e7367ab231a14d9f72a862d6660e49e092b3771a13ef6"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.504512 4959 generic.go:334] "Generic (PLEG): container finished" podID="2277b5ae-22d7-4c52-8660-40b4c4a382b6" containerID="22aa77366f82180984de7895cdf471db9878ee9586f7c93025ae863c765a6369" exitCode=0 Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.504586 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" 
event={"ID":"2277b5ae-22d7-4c52-8660-40b4c4a382b6","Type":"ContainerDied","Data":"22aa77366f82180984de7895cdf471db9878ee9586f7c93025ae863c765a6369"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.505982 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" event={"ID":"473f716e-b637-46dd-a9aa-c6fd19b2c12d","Type":"ContainerStarted","Data":"eb31687548a4fe57083906cefdade947bb568005f0acb65984d000ac594154dd"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.506219 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.508303 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" event={"ID":"f6cc7027-7058-494d-bebf-7f32ae551c27","Type":"ContainerStarted","Data":"8083054d5c4045efd38b3c11334c26f0bbdebbe2eb93d6390e91f0699b74cb18"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.508887 4959 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t7nqt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.509022 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" podUID="473f716e-b637-46dd-a9aa-c6fd19b2c12d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.519438 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" event={"ID":"74633d3d-6855-4249-a997-ee82ea68771b","Type":"ContainerStarted","Data":"d88b7cabf79074f3485f2b80ca0acbe6670f5f398b18fcb30376d74ce9804a3c"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.519519 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" event={"ID":"74633d3d-6855-4249-a997-ee82ea68771b","Type":"ContainerStarted","Data":"24be67305ccbef951bcc71121c5027fcf3cd46d745c8a51adc2ae9c47039099f"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.533570 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" event={"ID":"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15","Type":"ContainerStarted","Data":"6ee157f837e0966ded08335669dbfbc41ca82ba936c44d86a2e98d4a5411f665"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.533896 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" event={"ID":"af1dcec3-3b26-4c6c-ac3f-26acfb5ffb15","Type":"ContainerStarted","Data":"652b3c6e930038ad1b4160bc2e0473888ea4d6962f6cbf77c86a6de4f054fbfd"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.534955 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.539611 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dade8c5d642b49b6659e3bed3526496258578d5f31f95ca62921e77deba31d3a"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.539870 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"82afde4871b61581f53ab390a882a9414dacba86c84fd91b8133eb8453050b6e"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.540714 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.541289 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.041260275 +0000 UTC m=+153.201982952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.541733 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.545184 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.045159194 +0000 UTC m=+153.205881871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.570099 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" event={"ID":"235fc95a-b9be-4d4b-82af-87213702f88d","Type":"ContainerStarted","Data":"1ddbcad51ef691174f0eb7a3de8862cbc0c413d322febaddf8f99461cc6a6131"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.590772 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" event={"ID":"7136b871-7eea-4199-8712-643539681f53","Type":"ContainerStarted","Data":"7449d3aa985d6d160ef0910816c34f90ca0d9582862142be7dbf46ef3946ee7d"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.600090 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qg97s" event={"ID":"6a416b28-ad73-4229-ab18-f467e457330c","Type":"ContainerStarted","Data":"e93d4a15e8e05030e59fcc70f45a0dafecf8e1d2d2180c3fcd0763d0d42ddabf"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.609366 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hszvw" event={"ID":"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5","Type":"ContainerStarted","Data":"0350f7bf59f94265da1a926686daca277822310c614e45f143032ba8ae30963b"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.609415 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hszvw" 
event={"ID":"becaf5d5-ad74-4d9e-9b12-fc9399c41fa5","Type":"ContainerStarted","Data":"33457db306413bc14ad47b380e280bcd82d9aec1d4aa9a8ebdf6eb87814263c6"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.610215 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.631178 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ef60650074802598d788422d821ef8cc76b2e25e37d9578b3a1a53cf7d3d41ae"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.631231 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8aaf3335746930d208880cc59f0abfc0d4833c7d26d0de510863c1c566ddbd91"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.631752 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.635396 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" event={"ID":"4c4b9607-9bba-4c4d-ab60-a3e14de17949","Type":"ContainerStarted","Data":"d12920a2ea6f7fea6e54c28aeb2f7950e6937a63f3382669203c48550b19b6b3"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.638764 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" podStartSLOduration=131.638751238 podStartE2EDuration="2m11.638751238s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 
13:03:20.63730704 +0000 UTC m=+152.798029717" watchObservedRunningTime="2025-10-07 13:03:20.638751238 +0000 UTC m=+152.799473915" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.639010 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5hgz" podStartSLOduration=131.639003046 podStartE2EDuration="2m11.639003046s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:20.591023715 +0000 UTC m=+152.751746412" watchObservedRunningTime="2025-10-07 13:03:20.639003046 +0000 UTC m=+152.799725723" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.643401 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.643775 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.143738703 +0000 UTC m=+153.304461540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.644324 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.646466 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.146455774 +0000 UTC m=+153.307178661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.648349 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" event={"ID":"25a345f3-48cb-42f4-945f-1c373095e97b","Type":"ContainerStarted","Data":"99b3b68818168804abc166a81cf3af39862fee93dfaf6352a286b954c5e3c640"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.648414 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" event={"ID":"25a345f3-48cb-42f4-945f-1c373095e97b","Type":"ContainerStarted","Data":"2a47a8c20a6cce110847d5c0519085ec58113cda11735402a2c1731c654b6573"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.667946 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" event={"ID":"879a5223-92dc-4b9e-9749-15fe196c09dd","Type":"ContainerStarted","Data":"b6db2f41ef3e7b863a4f36661f8262fc68228b1482aa2b63eeffffa7ea7dc59a"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.668011 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" event={"ID":"879a5223-92dc-4b9e-9749-15fe196c09dd","Type":"ContainerStarted","Data":"9227e8265c9145385fb408571cf043828a8ac702261696dcf574e52bfe914837"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.683670 4959 generic.go:334] "Generic (PLEG): container finished" 
podID="d3f29540-afe4-44a5-af16-a86cb8d700da" containerID="472f8ed8ac9fda62e046e8adc69f77e8449dffb78d498ea2712c5ea608a2be7c" exitCode=0 Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.684077 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" event={"ID":"d3f29540-afe4-44a5-af16-a86cb8d700da","Type":"ContainerDied","Data":"472f8ed8ac9fda62e046e8adc69f77e8449dffb78d498ea2712c5ea608a2be7c"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.689042 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" event={"ID":"26caac7f-4892-4897-9ae4-0b8aab4c0bc8","Type":"ContainerStarted","Data":"ce5ce72192fe52e67e6ba9b41f140e29dd4e13dd55fbaab82d1b29b327801164"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.689755 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.698979 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" event={"ID":"ea769db2-2681-4d3a-9968-ec18475c4690","Type":"ContainerStarted","Data":"24abef28fef489328ffc97911cf5df7b1181dec6726d0f5e9c6a3306d38aecb0"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.699030 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" event={"ID":"ea769db2-2681-4d3a-9968-ec18475c4690","Type":"ContainerStarted","Data":"44935c392fc0e6964694f086312d3cc28534397f31ef5c8d0716c7a19873af79"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.701509 4959 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sks8d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: 
connect: connection refused" start-of-body= Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.701550 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" podUID="26caac7f-4892-4897-9ae4-0b8aab4c0bc8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.706175 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" event={"ID":"e8c54507-e974-490a-97d8-601f70886942","Type":"ContainerStarted","Data":"606b6253875e874cdca26c3518dba83a17d0815b404b03b45b3f8fc09c0f82a6"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.729016 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" event={"ID":"5b487cdd-8a08-4621-9259-567d66d5cc06","Type":"ContainerStarted","Data":"8457f7e55a2f6fd8e487dda8cc7d95501d75f8ae2d3da336170d1cd0fc638bb7"} Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.731666 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.731877 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-np26j" podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.731821 4959 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z65mr container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.732074 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" podUID="85df292a-1000-48f0-be15-823ada38a57b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.744191 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhstz" podStartSLOduration=131.744171494 podStartE2EDuration="2m11.744171494s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:20.738605719 +0000 UTC m=+152.899328416" watchObservedRunningTime="2025-10-07 13:03:20.744171494 +0000 UTC m=+152.904894171" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.746093 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.747847 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.247824575 +0000 UTC m=+153.408547252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.777874 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t74ts" podStartSLOduration=131.777851541 podStartE2EDuration="2m11.777851541s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:20.775052228 +0000 UTC m=+152.935774915" watchObservedRunningTime="2025-10-07 13:03:20.777851541 +0000 UTC m=+152.938574218" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.834833 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" podStartSLOduration=131.83481593 podStartE2EDuration="2m11.83481593s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:20.833948781 +0000 UTC m=+152.994671468" watchObservedRunningTime="2025-10-07 13:03:20.83481593 +0000 UTC m=+152.995538607" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.849090 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:20 crc kubenswrapper[4959]: E1007 13:03:20.860699 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.360656357 +0000 UTC m=+153.521379234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.906822 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wvxrf" podStartSLOduration=131.906807357 podStartE2EDuration="2m11.906807357s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:20.90569449 +0000 UTC m=+153.066417167" watchObservedRunningTime="2025-10-07 13:03:20.906807357 +0000 UTC m=+153.067530024" Oct 07 13:03:20 crc kubenswrapper[4959]: I1007 13:03:20.950524 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:20 crc kubenswrapper[4959]: 
E1007 13:03:20.951555 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.4515229 +0000 UTC m=+153.612245577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.039388 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hszvw" podStartSLOduration=8.039362893 podStartE2EDuration="8.039362893s" podCreationTimestamp="2025-10-07 13:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.00611425 +0000 UTC m=+153.166836947" watchObservedRunningTime="2025-10-07 13:03:21.039362893 +0000 UTC m=+153.200085570" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.041716 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ptjbf" podStartSLOduration=132.041703831 podStartE2EDuration="2m12.041703831s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.038902088 +0000 UTC m=+153.199624775" watchObservedRunningTime="2025-10-07 13:03:21.041703831 +0000 UTC m=+153.202426508" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 
13:03:21.053286 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.053750 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.55373228 +0000 UTC m=+153.714454957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.118596 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" podStartSLOduration=132.118554429 podStartE2EDuration="2m12.118554429s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.090025233 +0000 UTC m=+153.250747920" watchObservedRunningTime="2025-10-07 13:03:21.118554429 +0000 UTC m=+153.279277126" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.154254 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.154587 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.654566744 +0000 UTC m=+153.815289421 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.155033 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.155318 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.655311398 +0000 UTC m=+153.816034075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.184105 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-f6pcc" podStartSLOduration=131.184084842 podStartE2EDuration="2m11.184084842s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.130173905 +0000 UTC m=+153.290896602" watchObservedRunningTime="2025-10-07 13:03:21.184084842 +0000 UTC m=+153.344807519" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.196370 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:21 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:21 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:21 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.196872 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.252736 4959 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6stgg" podStartSLOduration=132.252718678 podStartE2EDuration="2m12.252718678s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.240086279 +0000 UTC m=+153.400808966" watchObservedRunningTime="2025-10-07 13:03:21.252718678 +0000 UTC m=+153.413441365" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.254493 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qg97s" podStartSLOduration=8.254488277 podStartE2EDuration="8.254488277s" podCreationTimestamp="2025-10-07 13:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.186472412 +0000 UTC m=+153.347195089" watchObservedRunningTime="2025-10-07 13:03:21.254488277 +0000 UTC m=+153.415210954" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.257515 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.257922 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.75789733 +0000 UTC m=+153.918620007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.359635 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.359952 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.859939424 +0000 UTC m=+154.020662101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.413800 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" podStartSLOduration=131.413775389 podStartE2EDuration="2m11.413775389s" podCreationTimestamp="2025-10-07 13:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.310309518 +0000 UTC m=+153.471032185" watchObservedRunningTime="2025-10-07 13:03:21.413775389 +0000 UTC m=+153.574498066" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.461040 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.461345 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:21.961329876 +0000 UTC m=+154.122052553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.562636 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.563074 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.06305116 +0000 UTC m=+154.223773837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.577986 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9rgh8" podStartSLOduration=132.577963014 podStartE2EDuration="2m12.577963014s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.572603096 +0000 UTC m=+153.733325783" watchObservedRunningTime="2025-10-07 13:03:21.577963014 +0000 UTC m=+153.738685691" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.579015 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-47hzk" podStartSLOduration=132.57901016900001 podStartE2EDuration="2m12.579010169s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.540534743 +0000 UTC m=+153.701257430" watchObservedRunningTime="2025-10-07 13:03:21.579010169 +0000 UTC m=+153.739732846" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.663908 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.664408 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.16438268 +0000 UTC m=+154.325105357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.669177 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8l4sz" podStartSLOduration=132.669151258 podStartE2EDuration="2m12.669151258s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.667189083 +0000 UTC m=+153.827911760" watchObservedRunningTime="2025-10-07 13:03:21.669151258 +0000 UTC m=+153.829873935" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.730410 4959 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4wmmh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 
13:03:21.730482 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh" podUID="6abc1eb3-fdaf-42b1-91e9-4ba2a120f65e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.736607 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" event={"ID":"2277b5ae-22d7-4c52-8660-40b4c4a382b6","Type":"ContainerStarted","Data":"df9a05ec82f39b6d0c0af4ea62e2e018b2cc13c2a13f11e938b67b555fb2d9b7"} Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.736952 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.739309 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" event={"ID":"d3f29540-afe4-44a5-af16-a86cb8d700da","Type":"ContainerStarted","Data":"7d3c5c17c20760cb121b630ff589ec991810d5f08ecc2a664352fc0f32581774"} Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.740424 4959 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sks8d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.740483 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" podUID="26caac7f-4892-4897-9ae4-0b8aab4c0bc8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 
10.217.0.21:8443: connect: connection refused" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.741025 4959 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t7nqt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.741518 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt" podUID="473f716e-b637-46dd-a9aa-c6fd19b2c12d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.765289 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.765867 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.265841335 +0000 UTC m=+154.426564012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.834306 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc" podStartSLOduration=132.834284834 podStartE2EDuration="2m12.834284834s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.833515709 +0000 UTC m=+153.994238396" watchObservedRunningTime="2025-10-07 13:03:21.834284834 +0000 UTC m=+153.995007501" Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.866944 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.868889 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.368870561 +0000 UTC m=+154.529593238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:21 crc kubenswrapper[4959]: I1007 13:03:21.969452 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:21 crc kubenswrapper[4959]: E1007 13:03:21.969802 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.469790218 +0000 UTC m=+154.630512895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.070478 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.070697 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.570671453 +0000 UTC m=+154.731394130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.070808 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.071129 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.571119398 +0000 UTC m=+154.731842075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.171603 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.172117 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.672094636 +0000 UTC m=+154.832817313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.189436 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:22 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:22 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:22 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.189558 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.272880 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.273524 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 13:03:22.773493989 +0000 UTC m=+154.934216666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.374780 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.375184 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.87516774 +0000 UTC m=+155.035890417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.375789 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.376150 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.876143103 +0000 UTC m=+155.036865780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.477832 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.478311 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:22.9782798 +0000 UTC m=+155.139002467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.579951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.580379 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.080362395 +0000 UTC m=+155.241085072 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.603013 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.652902 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" podStartSLOduration=133.65287694 podStartE2EDuration="2m13.65287694s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:21.922334854 +0000 UTC m=+154.083057531" watchObservedRunningTime="2025-10-07 13:03:22.65287694 +0000 UTC m=+154.813599617" Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.681757 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.683093 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 13:03:23.183069191 +0000 UTC m=+155.343791878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.746863 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" event={"ID":"22041b2f-8d5a-4193-98da-808ee1dc9777","Type":"ContainerStarted","Data":"f687a311841fb07d639a9103174b72e91d8fb5e704f24210686630b0abe6f77d"} Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.755451 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sks8d" Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.784303 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.785440 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.285421235 +0000 UTC m=+155.446143912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.885935 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.886788 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.386764756 +0000 UTC m=+155.547487433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:22 crc kubenswrapper[4959]: I1007 13:03:22.987792 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:22 crc kubenswrapper[4959]: E1007 13:03:22.988335 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.488314083 +0000 UTC m=+155.649036760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.089608 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.090153 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.59013089 +0000 UTC m=+155.750853557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.156706 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.157636 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.160722 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.163489 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.171724 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.187685 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:23 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:23 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:23 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.187770 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.191812 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.191877 
4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.191938 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.192284 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.692263267 +0000 UTC m=+155.852985944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.293152 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.293409 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.79336804 +0000 UTC m=+155.954090717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.293771 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.293820 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.293863 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.293950 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.294227 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.794216958 +0000 UTC m=+155.954939635 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.323135 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.395082 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.395505 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 13:03:23.895480716 +0000 UTC m=+156.056203393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.479290 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.496573 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.497587 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:23.997572051 +0000 UTC m=+156.158294718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.597333 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.597854 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.097831706 +0000 UTC m=+156.258554383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.642199 4959 patch_prober.go:28] interesting pod/apiserver-76f77b778f-b99bm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]log ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]etcd ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/generic-apiserver-start-informers ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/max-in-flight-filter ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 07 13:03:23 crc kubenswrapper[4959]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 07 13:03:23 crc kubenswrapper[4959]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/project.openshift.io-projectcache ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/openshift.io-startinformers ok Oct 07 13:03:23 crc kubenswrapper[4959]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 07 13:03:23 crc 
kubenswrapper[4959]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 07 13:03:23 crc kubenswrapper[4959]: livez check failed Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.642274 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" podUID="e7ea9371-6a1a-4aac-8361-f9f68dcdc194" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.699067 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.699445 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.199414475 +0000 UTC m=+156.360137152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.799575 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.800323 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.300306581 +0000 UTC m=+156.461029258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.854862 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.902504 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:23 crc kubenswrapper[4959]: E1007 13:03:23.904356 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.404339331 +0000 UTC m=+156.565062008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:23 crc kubenswrapper[4959]: I1007 13:03:23.966207 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.004119 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.004353 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.504311666 +0000 UTC m=+156.665034353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.004499 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.004996 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.504977948 +0000 UTC m=+156.665700625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.028276 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.105817 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.107486 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.607436066 +0000 UTC m=+156.768158743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.185546 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:24 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:24 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:24 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.185619 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.189242 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dfqft"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.193559 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.195903 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.207434 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfqft"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.207583 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.208047 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.708031252 +0000 UTC m=+156.868753929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.212494 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-b48pv" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.308591 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.308937 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-catalog-content\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.308991 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2nh\" (UniqueName: \"kubernetes.io/projected/726f45c1-32a5-47f2-9540-d6ab00654be3-kube-api-access-mz2nh\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.309135 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-utilities\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.310135 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.810106387 +0000 UTC m=+156.970829114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.381597 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8vtzh"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.382967 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.388078 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.397224 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vtzh"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.410767 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-utilities\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.410843 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-catalog-content\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.410871 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2nh\" (UniqueName: \"kubernetes.io/projected/726f45c1-32a5-47f2-9540-d6ab00654be3-kube-api-access-mz2nh\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.410914 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: 
\"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.411318 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:24.911300802 +0000 UTC m=+157.072023479 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.411598 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-catalog-content\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.411855 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-utilities\") pod \"community-operators-dfqft\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.435100 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2nh\" (UniqueName: \"kubernetes.io/projected/726f45c1-32a5-47f2-9540-d6ab00654be3-kube-api-access-mz2nh\") pod \"community-operators-dfqft\" (UID: 
\"726f45c1-32a5-47f2-9540-d6ab00654be3\") " pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.511411 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.511537 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.011510306 +0000 UTC m=+157.172232983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.511788 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjth\" (UniqueName: \"kubernetes.io/projected/04b359b0-6002-4de2-912e-e5068f8fe8fe-kube-api-access-gvjth\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.511828 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-catalog-content\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.511874 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.511904 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-utilities\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.512245 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.01223773 +0000 UTC m=+157.172960407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.520605 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfqft" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.552707 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.553278 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.556037 4959 patch_prober.go:28] interesting pod/console-f9d7485db-j72km container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.556108 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-j72km" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.580708 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qhdcb"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.582241 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhdcb" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.603509 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhdcb"] Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.612730 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.613020 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.112988451 +0000 UTC m=+157.273711128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614736 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-catalog-content\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614777 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dtp4\" (UniqueName: \"kubernetes.io/projected/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-kube-api-access-6dtp4\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614833 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjth\" (UniqueName: \"kubernetes.io/projected/04b359b0-6002-4de2-912e-e5068f8fe8fe-kube-api-access-gvjth\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614884 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-utilities\") pod 
\"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614906 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-catalog-content\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614945 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.614976 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-utilities\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.616472 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.116456996 +0000 UTC m=+157.277179673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.616881 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-catalog-content\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.616892 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-utilities\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.637601 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjth\" (UniqueName: \"kubernetes.io/projected/04b359b0-6002-4de2-912e-e5068f8fe8fe-kube-api-access-gvjth\") pod \"certified-operators-8vtzh\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") " pod="openshift-marketplace/certified-operators-8vtzh" Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.704133 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.711698 4959 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.717348 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.717613 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-utilities\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.717728 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-catalog-content\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.717758 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtp4\" (UniqueName: \"kubernetes.io/projected/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-kube-api-access-6dtp4\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.718335 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.218308014 +0000 UTC m=+157.379030691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.718781 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-utilities\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.720650 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-catalog-content\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.737266 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtp4\" (UniqueName: \"kubernetes.io/projected/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-kube-api-access-6dtp4\") pod \"community-operators-qhdcb\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") " pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.766796 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfqft"]
Oct 07 13:03:24 crc kubenswrapper[4959]: W1007 13:03:24.782089 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod726f45c1_32a5_47f2_9540_d6ab00654be3.slice/crio-126e196d639536ed1a2238822e0fdfe5a696ae17ce18ad18fb4f36ecbb5458ec WatchSource:0}: Error finding container 126e196d639536ed1a2238822e0fdfe5a696ae17ce18ad18fb4f36ecbb5458ec: Status 404 returned error can't find the container with id 126e196d639536ed1a2238822e0fdfe5a696ae17ce18ad18fb4f36ecbb5458ec
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.786787 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6bptn"]
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.787859 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.792123 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bptn"]
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.818949 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82rdc\" (UniqueName: \"kubernetes.io/projected/b31f5e5a-8410-4933-a446-515c58b1e7c3-kube-api-access-82rdc\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.818993 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-utilities\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.819051 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.819125 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-catalog-content\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.819162 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfqft" event={"ID":"726f45c1-32a5-47f2-9540-d6ab00654be3","Type":"ContainerStarted","Data":"126e196d639536ed1a2238822e0fdfe5a696ae17ce18ad18fb4f36ecbb5458ec"}
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.819207 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ca6eb94a-e1db-4531-beb5-a7d52c10de3f","Type":"ContainerStarted","Data":"2c4ec4f37e361424a7a5e0202fc812e8c49b32cbfcae022c0c0da86d8c2f33c6"}
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.819221 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ca6eb94a-e1db-4531-beb5-a7d52c10de3f","Type":"ContainerStarted","Data":"b9c69701c9815849ed5487da8adbd9c7d852ba8e711e33a0697fa2259b1c7a2e"}
Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.819518 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.319486299 +0000 UTC m=+157.480208976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4wfvg" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.822468 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" event={"ID":"22041b2f-8d5a-4193-98da-808ee1dc9777","Type":"ContainerStarted","Data":"a70cafb910336a0950caf521fc7dd8deb75ab04662d6050a100aa54096a02991"}
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.822509 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" event={"ID":"22041b2f-8d5a-4193-98da-808ee1dc9777","Type":"ContainerStarted","Data":"df5fc910f80b6a67e8d0d3b883edc4023594799000cc1bdc4b0ae0d57e778570"}
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.840690 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.840663861 podStartE2EDuration="1.840663861s" podCreationTimestamp="2025-10-07 13:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:24.839757051 +0000 UTC m=+157.000479748" watchObservedRunningTime="2025-10-07 13:03:24.840663861 +0000 UTC m=+157.001386528"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.915798 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.920150 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 13:03:24 crc kubenswrapper[4959]: E1007 13:03:24.920506 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:03:25.420479028 +0000 UTC m=+157.581201865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.922738 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-catalog-content\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.923034 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82rdc\" (UniqueName: \"kubernetes.io/projected/b31f5e5a-8410-4933-a446-515c58b1e7c3-kube-api-access-82rdc\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.923056 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-utilities\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.923493 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-utilities\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.923880 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-catalog-content\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.953880 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82rdc\" (UniqueName: \"kubernetes.io/projected/b31f5e5a-8410-4933-a446-515c58b1e7c3-kube-api-access-82rdc\") pod \"certified-operators-6bptn\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.981269 4959 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T13:03:24.711740426Z","Handler":null,"Name":""}
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.984999 4959 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 07 13:03:24 crc kubenswrapper[4959]: I1007 13:03:24.985040 4959 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.024488 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.031763 4959 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.031803 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.074007 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4wfvg\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.126483 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.133677 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.144680 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.145097 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rz2xc"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.169102 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.186174 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhdcb"]
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.193036 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 13:03:25 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld
Oct 07 13:03:25 crc kubenswrapper[4959]: [+]process-running ok
Oct 07 13:03:25 crc kubenswrapper[4959]: healthz check failed
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.193113 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.250525 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vtzh"]
Oct 07 13:03:25 crc kubenswrapper[4959]: W1007 13:03:25.254942 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7cf0a35_1af6_4c0c_b8e0_f7a0727071d4.slice/crio-6e131ac6ee423e445928b579fb30112717e6a08f6cefae69b9dba744a0635a28 WatchSource:0}: Error finding container 6e131ac6ee423e445928b579fb30112717e6a08f6cefae69b9dba744a0635a28: Status 404 returned error can't find the container with id 6e131ac6ee423e445928b579fb30112717e6a08f6cefae69b9dba744a0635a28
Oct 07 13:03:25 crc kubenswrapper[4959]: W1007 13:03:25.261169 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b359b0_6002_4de2_912e_e5068f8fe8fe.slice/crio-3380c178882e341fdf633f216bdcb46a1b9213707cad855e57381294395d3272 WatchSource:0}: Error finding container 3380c178882e341fdf633f216bdcb46a1b9213707cad855e57381294395d3272: Status 404 returned error can't find the container with id 3380c178882e341fdf633f216bdcb46a1b9213707cad855e57381294395d3272
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.461709 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.462263 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-np26j" podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.462790 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.462984 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-np26j" podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.483606 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4wfvg"]
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.503066 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gct22"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.546300 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bptn"]
Oct 07 13:03:25 crc kubenswrapper[4959]: W1007 13:03:25.591696 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31f5e5a_8410_4933_a446_515c58b1e7c3.slice/crio-5c5c9c5a5caa7a13dc28a99d147fb0453d4875690442c7333db90373e94194dc WatchSource:0}: Error finding container 5c5c9c5a5caa7a13dc28a99d147fb0453d4875690442c7333db90373e94194dc: Status 404 returned error can't find the container with id 5c5c9c5a5caa7a13dc28a99d147fb0453d4875690442c7333db90373e94194dc
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.830941 4959 generic.go:334] "Generic (PLEG): container finished" podID="5b487cdd-8a08-4621-9259-567d66d5cc06" containerID="8457f7e55a2f6fd8e487dda8cc7d95501d75f8ae2d3da336170d1cd0fc638bb7" exitCode=0
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.831013 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" event={"ID":"5b487cdd-8a08-4621-9259-567d66d5cc06","Type":"ContainerDied","Data":"8457f7e55a2f6fd8e487dda8cc7d95501d75f8ae2d3da336170d1cd0fc638bb7"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.834936 4959 generic.go:334] "Generic (PLEG): container finished" podID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerID="d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f" exitCode=0
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.834983 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhdcb" event={"ID":"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4","Type":"ContainerDied","Data":"d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.835034 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhdcb" event={"ID":"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4","Type":"ContainerStarted","Data":"6e131ac6ee423e445928b579fb30112717e6a08f6cefae69b9dba744a0635a28"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.837154 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.839037 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" event={"ID":"22041b2f-8d5a-4193-98da-808ee1dc9777","Type":"ContainerStarted","Data":"d510fa07f008d31490c6f49a5ca59d44dff576dc9131e07757a19cc6bbd8939c"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.841218 4959 generic.go:334] "Generic (PLEG): container finished" podID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerID="a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f" exitCode=0
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.841304 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vtzh" event={"ID":"04b359b0-6002-4de2-912e-e5068f8fe8fe","Type":"ContainerDied","Data":"a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.841370 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vtzh" event={"ID":"04b359b0-6002-4de2-912e-e5068f8fe8fe","Type":"ContainerStarted","Data":"3380c178882e341fdf633f216bdcb46a1b9213707cad855e57381294395d3272"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.843630 4959 generic.go:334] "Generic (PLEG): container finished" podID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerID="cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0" exitCode=0
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.843742 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfqft" event={"ID":"726f45c1-32a5-47f2-9540-d6ab00654be3","Type":"ContainerDied","Data":"cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.849570 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerStarted","Data":"bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.849691 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerStarted","Data":"5c5c9c5a5caa7a13dc28a99d147fb0453d4875690442c7333db90373e94194dc"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.852265 4959 generic.go:334] "Generic (PLEG): container finished" podID="ca6eb94a-e1db-4531-beb5-a7d52c10de3f" containerID="2c4ec4f37e361424a7a5e0202fc812e8c49b32cbfcae022c0c0da86d8c2f33c6" exitCode=0
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.852325 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ca6eb94a-e1db-4531-beb5-a7d52c10de3f","Type":"ContainerDied","Data":"2c4ec4f37e361424a7a5e0202fc812e8c49b32cbfcae022c0c0da86d8c2f33c6"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.854273 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" event={"ID":"14600350-80fe-4397-8fd8-02c6139cd9d6","Type":"ContainerStarted","Data":"e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.854304 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" event={"ID":"14600350-80fe-4397-8fd8-02c6139cd9d6","Type":"ContainerStarted","Data":"ebd71a1cb9657aaf72e2933e19e2b90677f907ce12e355dca412d4b73ef2a8e6"}
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.854600 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.895304 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dbzd2" podStartSLOduration=12.895277633 podStartE2EDuration="12.895277633s" podCreationTimestamp="2025-10-07 13:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:25.891596411 +0000 UTC m=+158.052319108" watchObservedRunningTime="2025-10-07 13:03:25.895277633 +0000 UTC m=+158.056000310"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.986886 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" podStartSLOduration=136.98686808 podStartE2EDuration="2m16.98686808s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:25.98383745 +0000 UTC m=+158.144560127" watchObservedRunningTime="2025-10-07 13:03:25.98686808 +0000 UTC m=+158.147590757"
Oct 07 13:03:25 crc kubenswrapper[4959]: I1007 13:03:25.987477 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4wmmh"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.021853 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.022373 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7nqt"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.111052 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.111170 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.119809 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.182885 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jn6sf"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.184971 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s57zz"]
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.186320 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.188547 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.199087 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 13:03:26 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld
Oct 07 13:03:26 crc kubenswrapper[4959]: [+]process-running ok
Oct 07 13:03:26 crc kubenswrapper[4959]: healthz check failed
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.199289 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.214139 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s57zz"]
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.259322 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-utilities\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.259505 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnjgk\" (UniqueName: \"kubernetes.io/projected/cd5b4f5b-d699-457d-8363-def61c21613b-kube-api-access-lnjgk\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.259540 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-catalog-content\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.361024 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnjgk\" (UniqueName: \"kubernetes.io/projected/cd5b4f5b-d699-457d-8363-def61c21613b-kube-api-access-lnjgk\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.361104 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-catalog-content\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.361162 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-utilities\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.361947 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-catalog-content\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.361975 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-utilities\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.383405 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnjgk\" (UniqueName: \"kubernetes.io/projected/cd5b4f5b-d699-457d-8363-def61c21613b-kube-api-access-lnjgk\") pod \"redhat-marketplace-s57zz\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") " pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.502108 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.580270 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lvndj"]
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.581426 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.616733 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvndj"]
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.666434 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7c6\" (UniqueName: \"kubernetes.io/projected/b8fe2270-9af2-4587-8e8d-33696177645a-kube-api-access-gk7c6\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.666494 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-catalog-content\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.666545 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-utilities\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.745334 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s57zz"]
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.769181 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7c6\" (UniqueName: \"kubernetes.io/projected/b8fe2270-9af2-4587-8e8d-33696177645a-kube-api-access-gk7c6\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.769240 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-catalog-content\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.769293 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-utilities\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.769878 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-utilities\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.771134 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-catalog-content\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") " pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.805295 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk7c6\" (UniqueName: \"kubernetes.io/projected/b8fe2270-9af2-4587-8e8d-33696177645a-kube-api-access-gk7c6\") pod \"redhat-marketplace-lvndj\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") "
pod="openshift-marketplace/redhat-marketplace-lvndj" Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.829251 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.863189 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s57zz" event={"ID":"cd5b4f5b-d699-457d-8363-def61c21613b","Type":"ContainerStarted","Data":"5a0317a3bb02c0c9f79a0ba2891f61c4eb8563ee7686885e011b0603765c7ea6"} Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.865109 4959 generic.go:334] "Generic (PLEG): container finished" podID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerID="bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb" exitCode=0 Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.865555 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerDied","Data":"bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb"} Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.887468 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2vw5" Oct 07 13:03:26 crc kubenswrapper[4959]: I1007 13:03:26.941652 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvndj" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.101478 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.176703 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kube-api-access\") pod \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.178276 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kubelet-dir\") pod \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\" (UID: \"ca6eb94a-e1db-4531-beb5-a7d52c10de3f\") " Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.178343 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ca6eb94a-e1db-4531-beb5-a7d52c10de3f" (UID: "ca6eb94a-e1db-4531-beb5-a7d52c10de3f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.178625 4959 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.182056 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ca6eb94a-e1db-4531-beb5-a7d52c10de3f" (UID: "ca6eb94a-e1db-4531-beb5-a7d52c10de3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.187903 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:27 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:27 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:27 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.187984 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.249091 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.280772 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/5b487cdd-8a08-4621-9259-567d66d5cc06-kube-api-access-lt98r\") pod \"5b487cdd-8a08-4621-9259-567d66d5cc06\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.280853 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b487cdd-8a08-4621-9259-567d66d5cc06-secret-volume\") pod \"5b487cdd-8a08-4621-9259-567d66d5cc06\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.280877 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume\") pod \"5b487cdd-8a08-4621-9259-567d66d5cc06\" (UID: \"5b487cdd-8a08-4621-9259-567d66d5cc06\") " Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.282131 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6eb94a-e1db-4531-beb5-a7d52c10de3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.282799 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume" (OuterVolumeSpecName: "config-volume") pod "5b487cdd-8a08-4621-9259-567d66d5cc06" (UID: "5b487cdd-8a08-4621-9259-567d66d5cc06"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.294240 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b487cdd-8a08-4621-9259-567d66d5cc06-kube-api-access-lt98r" (OuterVolumeSpecName: "kube-api-access-lt98r") pod "5b487cdd-8a08-4621-9259-567d66d5cc06" (UID: "5b487cdd-8a08-4621-9259-567d66d5cc06"). InnerVolumeSpecName "kube-api-access-lt98r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.299776 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b487cdd-8a08-4621-9259-567d66d5cc06-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5b487cdd-8a08-4621-9259-567d66d5cc06" (UID: "5b487cdd-8a08-4621-9259-567d66d5cc06"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.301930 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvndj"] Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.385488 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/5b487cdd-8a08-4621-9259-567d66d5cc06-kube-api-access-lt98r\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.385519 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b487cdd-8a08-4621-9259-567d66d5cc06-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.385527 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b487cdd-8a08-4621-9259-567d66d5cc06-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.394094 
4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f9pzw"] Oct 07 13:03:27 crc kubenswrapper[4959]: E1007 13:03:27.394504 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b487cdd-8a08-4621-9259-567d66d5cc06" containerName="collect-profiles" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.394527 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b487cdd-8a08-4621-9259-567d66d5cc06" containerName="collect-profiles" Oct 07 13:03:27 crc kubenswrapper[4959]: E1007 13:03:27.394535 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6eb94a-e1db-4531-beb5-a7d52c10de3f" containerName="pruner" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.394542 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6eb94a-e1db-4531-beb5-a7d52c10de3f" containerName="pruner" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.394722 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b487cdd-8a08-4621-9259-567d66d5cc06" containerName="collect-profiles" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.394738 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6eb94a-e1db-4531-beb5-a7d52c10de3f" containerName="pruner" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.395920 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.399204 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.403798 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9pzw"] Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.486798 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx2ld\" (UniqueName: \"kubernetes.io/projected/c41be072-ce09-432f-8ea3-b1a0363a74df-kube-api-access-qx2ld\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.486877 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-catalog-content\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.486916 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-utilities\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.588446 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx2ld\" (UniqueName: \"kubernetes.io/projected/c41be072-ce09-432f-8ea3-b1a0363a74df-kube-api-access-qx2ld\") pod \"redhat-operators-f9pzw\" (UID: 
\"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.588506 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-catalog-content\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.588527 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-utilities\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.589161 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-catalog-content\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.589191 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-utilities\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.606636 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx2ld\" (UniqueName: \"kubernetes.io/projected/c41be072-ce09-432f-8ea3-b1a0363a74df-kube-api-access-qx2ld\") pod \"redhat-operators-f9pzw\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") " 
pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.621431 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.622174 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.627131 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.632089 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.636697 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.691188 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09377877-6bb1-428f-8957-6178c779bfb8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.691300 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09377877-6bb1-428f-8957-6178c779bfb8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.740534 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f9pzw" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.789904 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4ch4j"] Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.791168 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.793745 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09377877-6bb1-428f-8957-6178c779bfb8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.793829 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09377877-6bb1-428f-8957-6178c779bfb8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.794450 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09377877-6bb1-428f-8957-6178c779bfb8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.797972 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4ch4j"] Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.825173 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/09377877-6bb1-428f-8957-6178c779bfb8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.875139 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ca6eb94a-e1db-4531-beb5-a7d52c10de3f","Type":"ContainerDied","Data":"b9c69701c9815849ed5487da8adbd9c7d852ba8e711e33a0697fa2259b1c7a2e"} Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.875179 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c69701c9815849ed5487da8adbd9c7d852ba8e711e33a0697fa2259b1c7a2e" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.875246 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.886347 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.886409 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr" event={"ID":"5b487cdd-8a08-4621-9259-567d66d5cc06","Type":"ContainerDied","Data":"5c0dcbf579fcee7ca6ce5802df71c293018c7dfd113891f3259fdc3055f02d12"} Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.886481 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c0dcbf579fcee7ca6ce5802df71c293018c7dfd113891f3259fdc3055f02d12" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.893839 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerStarted","Data":"508cd8f9aecf950a1cc52e2b2ce2757d9386e9ee85cb513f6bab53ff4815fd28"} Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.893894 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerStarted","Data":"613322e16e3ff235de8201873741969faf5be3eff5c6fd71e1d578074ff73497"} Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.894719 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-utilities\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.894760 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xzph\" (UniqueName: \"kubernetes.io/projected/68d5824e-afe0-4c5d-99b0-485644e77905-kube-api-access-7xzph\") pod 
\"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.894792 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-catalog-content\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.896857 4959 generic.go:334] "Generic (PLEG): container finished" podID="cd5b4f5b-d699-457d-8363-def61c21613b" containerID="002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f" exitCode=0 Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.898103 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s57zz" event={"ID":"cd5b4f5b-d699-457d-8363-def61c21613b","Type":"ContainerDied","Data":"002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f"} Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.967161 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:27 crc kubenswrapper[4959]: I1007 13:03:27.985258 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9pzw"] Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.002962 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xzph\" (UniqueName: \"kubernetes.io/projected/68d5824e-afe0-4c5d-99b0-485644e77905-kube-api-access-7xzph\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.003029 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-catalog-content\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.003175 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-utilities\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: W1007 13:03:28.003390 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41be072_ce09_432f_8ea3_b1a0363a74df.slice/crio-a9ef56416378f0c229e3d48532f7c699834cef148761cea250fcff76abde8458 WatchSource:0}: Error finding container a9ef56416378f0c229e3d48532f7c699834cef148761cea250fcff76abde8458: Status 404 returned error can't find the container with id a9ef56416378f0c229e3d48532f7c699834cef148761cea250fcff76abde8458 Oct 07 13:03:28 crc 
kubenswrapper[4959]: I1007 13:03:28.004603 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-catalog-content\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.004950 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-utilities\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.026826 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xzph\" (UniqueName: \"kubernetes.io/projected/68d5824e-afe0-4c5d-99b0-485644e77905-kube-api-access-7xzph\") pod \"redhat-operators-4ch4j\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.111901 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.193247 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:28 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:28 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:28 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.193387 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.318977 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4ch4j"] Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.409160 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:03:28 crc kubenswrapper[4959]: W1007 13:03:28.424224 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod09377877_6bb1_428f_8957_6178c779bfb8.slice/crio-464202c615eefffa4c51ba2916a6301024c88e5cbe2c4b3e5af4a4960eeb3bf8 WatchSource:0}: Error finding container 464202c615eefffa4c51ba2916a6301024c88e5cbe2c4b3e5af4a4960eeb3bf8: Status 404 returned error can't find the container with id 464202c615eefffa4c51ba2916a6301024c88e5cbe2c4b3e5af4a4960eeb3bf8 Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.631801 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 
13:03:28.639301 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-b99bm" Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.912305 4959 generic.go:334] "Generic (PLEG): container finished" podID="b8fe2270-9af2-4587-8e8d-33696177645a" containerID="508cd8f9aecf950a1cc52e2b2ce2757d9386e9ee85cb513f6bab53ff4815fd28" exitCode=0 Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.912331 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerDied","Data":"508cd8f9aecf950a1cc52e2b2ce2757d9386e9ee85cb513f6bab53ff4815fd28"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.923487 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09377877-6bb1-428f-8957-6178c779bfb8","Type":"ContainerStarted","Data":"322e55d865fe7ea1300482b7caff5e3056efdae06cf67e8497a1acce2e6cc45e"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.923532 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09377877-6bb1-428f-8957-6178c779bfb8","Type":"ContainerStarted","Data":"464202c615eefffa4c51ba2916a6301024c88e5cbe2c4b3e5af4a4960eeb3bf8"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.927408 4959 generic.go:334] "Generic (PLEG): container finished" podID="68d5824e-afe0-4c5d-99b0-485644e77905" containerID="19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2" exitCode=0 Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.927477 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerDied","Data":"19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 
13:03:28.927504 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerStarted","Data":"e645779b47b1fa05076e06676938f9ccec15604489de07f61b618c0ed971382f"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.933568 4959 generic.go:334] "Generic (PLEG): container finished" podID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerID="aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4" exitCode=0 Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.935917 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9pzw" event={"ID":"c41be072-ce09-432f-8ea3-b1a0363a74df","Type":"ContainerDied","Data":"aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.935956 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9pzw" event={"ID":"c41be072-ce09-432f-8ea3-b1a0363a74df","Type":"ContainerStarted","Data":"a9ef56416378f0c229e3d48532f7c699834cef148761cea250fcff76abde8458"} Oct 07 13:03:28 crc kubenswrapper[4959]: I1007 13:03:28.973447 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.9734291210000001 podStartE2EDuration="1.973429121s" podCreationTimestamp="2025-10-07 13:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:03:28.970588747 +0000 UTC m=+161.131311434" watchObservedRunningTime="2025-10-07 13:03:28.973429121 +0000 UTC m=+161.134151798" Oct 07 13:03:29 crc kubenswrapper[4959]: I1007 13:03:29.187873 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:29 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:29 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:29 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:29 crc kubenswrapper[4959]: I1007 13:03:29.187966 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:03:29 crc kubenswrapper[4959]: I1007 13:03:29.942459 4959 generic.go:334] "Generic (PLEG): container finished" podID="09377877-6bb1-428f-8957-6178c779bfb8" containerID="322e55d865fe7ea1300482b7caff5e3056efdae06cf67e8497a1acce2e6cc45e" exitCode=0 Oct 07 13:03:29 crc kubenswrapper[4959]: I1007 13:03:29.942579 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09377877-6bb1-428f-8957-6178c779bfb8","Type":"ContainerDied","Data":"322e55d865fe7ea1300482b7caff5e3056efdae06cf67e8497a1acce2e6cc45e"} Oct 07 13:03:30 crc kubenswrapper[4959]: I1007 13:03:30.190694 4959 patch_prober.go:28] interesting pod/router-default-5444994796-jn6sf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:03:30 crc kubenswrapper[4959]: [-]has-synced failed: reason withheld Oct 07 13:03:30 crc kubenswrapper[4959]: [+]process-running ok Oct 07 13:03:30 crc kubenswrapper[4959]: healthz check failed Oct 07 13:03:30 crc kubenswrapper[4959]: I1007 13:03:30.190795 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jn6sf" podUID="2f920e09-08a8-49c4-b217-c53a126eb3bf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 
07 13:03:31 crc kubenswrapper[4959]: I1007 13:03:31.186520 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:31 crc kubenswrapper[4959]: I1007 13:03:31.188954 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jn6sf" Oct 07 13:03:31 crc kubenswrapper[4959]: I1007 13:03:31.636203 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hszvw" Oct 07 13:03:32 crc kubenswrapper[4959]: I1007 13:03:32.483489 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:03:32 crc kubenswrapper[4959]: I1007 13:03:32.503430 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed03c94e-16fb-42f7-8383-ac7c2c403298-metrics-certs\") pod \"network-metrics-daemon-g57ch\" (UID: \"ed03c94e-16fb-42f7-8383-ac7c2c403298\") " pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:03:32 crc kubenswrapper[4959]: I1007 13:03:32.728995 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g57ch" Oct 07 13:03:34 crc kubenswrapper[4959]: I1007 13:03:34.552955 4959 patch_prober.go:28] interesting pod/console-f9d7485db-j72km container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 07 13:03:34 crc kubenswrapper[4959]: I1007 13:03:34.553036 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-j72km" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 07 13:03:35 crc kubenswrapper[4959]: I1007 13:03:35.461260 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 13:03:35 crc kubenswrapper[4959]: I1007 13:03:35.461606 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-np26j" podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 13:03:35 crc kubenswrapper[4959]: I1007 13:03:35.461259 4959 patch_prober.go:28] interesting pod/downloads-7954f5f757-np26j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 13:03:35 crc kubenswrapper[4959]: I1007 13:03:35.461885 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-np26j" 
podUID="893a583d-ded3-4c17-b16c-b8f0e18ace91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.616806 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.693813 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09377877-6bb1-428f-8957-6178c779bfb8-kube-api-access\") pod \"09377877-6bb1-428f-8957-6178c779bfb8\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.693859 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09377877-6bb1-428f-8957-6178c779bfb8-kubelet-dir\") pod \"09377877-6bb1-428f-8957-6178c779bfb8\" (UID: \"09377877-6bb1-428f-8957-6178c779bfb8\") " Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.693953 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09377877-6bb1-428f-8957-6178c779bfb8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "09377877-6bb1-428f-8957-6178c779bfb8" (UID: "09377877-6bb1-428f-8957-6178c779bfb8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.694258 4959 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09377877-6bb1-428f-8957-6178c779bfb8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.695917 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.695971 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.698538 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09377877-6bb1-428f-8957-6178c779bfb8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "09377877-6bb1-428f-8957-6178c779bfb8" (UID: "09377877-6bb1-428f-8957-6178c779bfb8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:03:37 crc kubenswrapper[4959]: I1007 13:03:37.795381 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09377877-6bb1-428f-8957-6178c779bfb8-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:03:38 crc kubenswrapper[4959]: I1007 13:03:38.038227 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09377877-6bb1-428f-8957-6178c779bfb8","Type":"ContainerDied","Data":"464202c615eefffa4c51ba2916a6301024c88e5cbe2c4b3e5af4a4960eeb3bf8"} Oct 07 13:03:38 crc kubenswrapper[4959]: I1007 13:03:38.038279 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464202c615eefffa4c51ba2916a6301024c88e5cbe2c4b3e5af4a4960eeb3bf8" Oct 07 13:03:38 crc kubenswrapper[4959]: I1007 13:03:38.038342 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:03:44 crc kubenswrapper[4959]: I1007 13:03:44.559574 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:44 crc kubenswrapper[4959]: I1007 13:03:44.564995 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-j72km" Oct 07 13:03:45 crc kubenswrapper[4959]: I1007 13:03:45.178202 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:03:45 crc kubenswrapper[4959]: I1007 13:03:45.479175 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-np26j" Oct 07 13:03:56 crc kubenswrapper[4959]: I1007 13:03:56.295279 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vznnm" Oct 07 13:03:58 crc kubenswrapper[4959]: I1007 13:03:58.166875 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:04:07 crc kubenswrapper[4959]: I1007 13:04:07.696156 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:04:07 crc kubenswrapper[4959]: I1007 13:04:07.696572 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:04:10 crc kubenswrapper[4959]: E1007 13:04:10.593320 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage1950956405/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 13:04:10 crc kubenswrapper[4959]: E1007 13:04:10.593932 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gk7c6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lvndj_openshift-marketplace(b8fe2270-9af2-4587-8e8d-33696177645a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage1950956405/2\": happened during read: context canceled" logger="UnhandledError" Oct 07 13:04:10 crc kubenswrapper[4959]: E1007 13:04:10.595138 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage1950956405/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lvndj" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.024016 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lvndj" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.096225 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.096564 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lnjgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-s57zz_openshift-marketplace(cd5b4f5b-d699-457d-8363-def61c21613b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.097891 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-s57zz" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" Oct 07 13:04:13 crc 
kubenswrapper[4959]: E1007 13:04:13.111368 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.111522 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82rdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-6bptn_openshift-marketplace(b31f5e5a-8410-4933-a446-515c58b1e7c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.112825 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6bptn" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.115884 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.116059 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvjth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8vtzh_openshift-marketplace(04b359b0-6002-4de2-912e-e5068f8fe8fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:04:13 crc kubenswrapper[4959]: E1007 13:04:13.119615 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8vtzh" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" Oct 07 13:04:14 crc 
kubenswrapper[4959]: E1007 13:04:14.707785 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-s57zz" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" Oct 07 13:04:14 crc kubenswrapper[4959]: E1007 13:04:14.707786 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8vtzh" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" Oct 07 13:04:14 crc kubenswrapper[4959]: E1007 13:04:14.707804 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6bptn" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" Oct 07 13:04:14 crc kubenswrapper[4959]: E1007 13:04:14.781523 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 13:04:14 crc kubenswrapper[4959]: E1007 13:04:14.781720 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz2nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dfqft_openshift-marketplace(726f45c1-32a5-47f2-9540-d6ab00654be3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:04:14 crc kubenswrapper[4959]: E1007 13:04:14.783246 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dfqft" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" Oct 07 13:04:17 crc 
kubenswrapper[4959]: E1007 13:04:17.652797 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dfqft" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.678191 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.678673 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xzph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4ch4j_openshift-marketplace(68d5824e-afe0-4c5d-99b0-485644e77905): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.680261 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4ch4j" podUID="68d5824e-afe0-4c5d-99b0-485644e77905"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.707838 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.708015 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qx2ld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-f9pzw_openshift-marketplace(c41be072-ce09-432f-8ea3-b1a0363a74df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.709172 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-f9pzw" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.719384 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.719560 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6dtp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qhdcb_openshift-marketplace(e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 07 13:04:17 crc kubenswrapper[4959]: E1007 13:04:17.721572 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qhdcb" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4"
Oct 07 13:04:18 crc kubenswrapper[4959]: I1007 13:04:18.063887 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g57ch"]
Oct 07 13:04:18 crc kubenswrapper[4959]: I1007 13:04:18.304741 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g57ch" event={"ID":"ed03c94e-16fb-42f7-8383-ac7c2c403298","Type":"ContainerStarted","Data":"f353958eecdba6503c888cd0f5959683a88940e2b61319c763b5d5c80b344023"}
Oct 07 13:04:18 crc kubenswrapper[4959]: E1007 13:04:18.306288 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-f9pzw" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df"
Oct 07 13:04:18 crc kubenswrapper[4959]: E1007 13:04:18.306789 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qhdcb" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4"
Oct 07 13:04:18 crc kubenswrapper[4959]: E1007 13:04:18.307812 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4ch4j" podUID="68d5824e-afe0-4c5d-99b0-485644e77905"
Oct 07 13:04:19 crc kubenswrapper[4959]: I1007 13:04:19.313350 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g57ch" event={"ID":"ed03c94e-16fb-42f7-8383-ac7c2c403298","Type":"ContainerStarted","Data":"4d51edb2cfbe4adb74451cc3947f27e146535f301ae099fc725321a183fe6386"}
Oct 07 13:04:19 crc kubenswrapper[4959]: I1007 13:04:19.313721 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g57ch" event={"ID":"ed03c94e-16fb-42f7-8383-ac7c2c403298","Type":"ContainerStarted","Data":"92b20d611a149de55344a147b3ff0ef23b125dbff6fab904949756c89123e970"}
Oct 07 13:04:19 crc kubenswrapper[4959]: I1007 13:04:19.331509 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g57ch" podStartSLOduration=190.331488028 podStartE2EDuration="3m10.331488028s" podCreationTimestamp="2025-10-07 13:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:04:19.32968716 +0000 UTC m=+211.490409847" watchObservedRunningTime="2025-10-07 13:04:19.331488028 +0000 UTC m=+211.492210715"
Oct 07 13:04:26 crc kubenswrapper[4959]: I1007 13:04:26.366762 4959 generic.go:334] "Generic (PLEG): container finished" podID="b8fe2270-9af2-4587-8e8d-33696177645a" containerID="e796324023038f6b7f297f7c7caef457891f0e5fc4d6c797e03fd092338433d4" exitCode=0
Oct 07 13:04:26 crc kubenswrapper[4959]: I1007 13:04:26.367409 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerDied","Data":"e796324023038f6b7f297f7c7caef457891f0e5fc4d6c797e03fd092338433d4"}
Oct 07 13:04:27 crc kubenswrapper[4959]: I1007 13:04:27.375244 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerStarted","Data":"8fc9412578ff452ad024780a9063e0685e90f76b79eb5c4a27916a24bd7ece10"}
Oct 07 13:04:27 crc kubenswrapper[4959]: I1007 13:04:27.394869 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lvndj" podStartSLOduration=3.452335903 podStartE2EDuration="1m1.394813442s" podCreationTimestamp="2025-10-07 13:03:26 +0000 UTC" firstStartedPulling="2025-10-07 13:03:28.917768425 +0000 UTC m=+161.078491102" lastFinishedPulling="2025-10-07 13:04:26.860245964 +0000 UTC m=+219.020968641" observedRunningTime="2025-10-07 13:04:27.392182378 +0000 UTC m=+219.552905075" watchObservedRunningTime="2025-10-07 13:04:27.394813442 +0000 UTC m=+219.555536119"
Oct 07 13:04:30 crc kubenswrapper[4959]: I1007 13:04:30.393925 4959 generic.go:334] "Generic (PLEG): container finished" podID="cd5b4f5b-d699-457d-8363-def61c21613b" containerID="9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e" exitCode=0
Oct 07 13:04:30 crc kubenswrapper[4959]: I1007 13:04:30.393994 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s57zz" event={"ID":"cd5b4f5b-d699-457d-8363-def61c21613b","Type":"ContainerDied","Data":"9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e"}
Oct 07 13:04:31 crc kubenswrapper[4959]: I1007 13:04:31.401445 4959 generic.go:334] "Generic (PLEG): container finished" podID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerID="7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16" exitCode=0
Oct 07 13:04:31 crc kubenswrapper[4959]: I1007 13:04:31.401530 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vtzh" event={"ID":"04b359b0-6002-4de2-912e-e5068f8fe8fe","Type":"ContainerDied","Data":"7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16"}
Oct 07 13:04:31 crc kubenswrapper[4959]: I1007 13:04:31.410590 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s57zz" event={"ID":"cd5b4f5b-d699-457d-8363-def61c21613b","Type":"ContainerStarted","Data":"308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab"}
Oct 07 13:04:31 crc kubenswrapper[4959]: I1007 13:04:31.412463 4959 generic.go:334] "Generic (PLEG): container finished" podID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerID="fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb" exitCode=0
Oct 07 13:04:31 crc kubenswrapper[4959]: I1007 13:04:31.412679 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerDied","Data":"fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb"}
Oct 07 13:04:31 crc kubenswrapper[4959]: I1007 13:04:31.471076 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s57zz" podStartSLOduration=2.476272687 podStartE2EDuration="1m5.471057792s" podCreationTimestamp="2025-10-07 13:03:26 +0000 UTC" firstStartedPulling="2025-10-07 13:03:27.90108662 +0000 UTC m=+160.061809327" lastFinishedPulling="2025-10-07 13:04:30.895871755 +0000 UTC m=+223.056594432" observedRunningTime="2025-10-07 13:04:31.468722247 +0000 UTC m=+223.629444924" watchObservedRunningTime="2025-10-07 13:04:31.471057792 +0000 UTC m=+223.631780469"
Oct 07 13:04:32 crc kubenswrapper[4959]: I1007 13:04:32.421739 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vtzh" event={"ID":"04b359b0-6002-4de2-912e-e5068f8fe8fe","Type":"ContainerStarted","Data":"b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4"}
Oct 07 13:04:32 crc kubenswrapper[4959]: I1007 13:04:32.424589 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerStarted","Data":"afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2"}
Oct 07 13:04:32 crc kubenswrapper[4959]: I1007 13:04:32.449579 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8vtzh" podStartSLOduration=2.322575527 podStartE2EDuration="1m8.449556758s" podCreationTimestamp="2025-10-07 13:03:24 +0000 UTC" firstStartedPulling="2025-10-07 13:03:25.843790736 +0000 UTC m=+158.004513413" lastFinishedPulling="2025-10-07 13:04:31.970771967 +0000 UTC m=+224.131494644" observedRunningTime="2025-10-07 13:04:32.446564461 +0000 UTC m=+224.607287148" watchObservedRunningTime="2025-10-07 13:04:32.449556758 +0000 UTC m=+224.610279455"
Oct 07 13:04:32 crc kubenswrapper[4959]: I1007 13:04:32.463405 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6bptn" podStartSLOduration=2.233269725 podStartE2EDuration="1m8.463385583s" podCreationTimestamp="2025-10-07 13:03:24 +0000 UTC" firstStartedPulling="2025-10-07 13:03:25.851595585 +0000 UTC m=+158.012318262" lastFinishedPulling="2025-10-07 13:04:32.081711443 +0000 UTC m=+224.242434120" observedRunningTime="2025-10-07 13:04:32.460412227 +0000 UTC m=+224.621134914" watchObservedRunningTime="2025-10-07 13:04:32.463385583 +0000 UTC m=+224.624108280"
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.432262 4959 generic.go:334] "Generic (PLEG): container finished" podID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerID="110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92" exitCode=0
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.432291 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9pzw" event={"ID":"c41be072-ce09-432f-8ea3-b1a0363a74df","Type":"ContainerDied","Data":"110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92"}
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.435615 4959 generic.go:334] "Generic (PLEG): container finished" podID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerID="e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5" exitCode=0
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.435698 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhdcb" event={"ID":"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4","Type":"ContainerDied","Data":"e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5"}
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.439024 4959 generic.go:334] "Generic (PLEG): container finished" podID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerID="e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5" exitCode=0
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.439068 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfqft" event={"ID":"726f45c1-32a5-47f2-9540-d6ab00654be3","Type":"ContainerDied","Data":"e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5"}
Oct 07 13:04:33 crc kubenswrapper[4959]: I1007 13:04:33.442971 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerStarted","Data":"93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7"}
Oct 07 13:04:34 crc kubenswrapper[4959]: I1007 13:04:34.454280 4959 generic.go:334] "Generic (PLEG): container finished" podID="68d5824e-afe0-4c5d-99b0-485644e77905" containerID="93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7" exitCode=0
Oct 07 13:04:34 crc kubenswrapper[4959]: I1007 13:04:34.454740 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerDied","Data":"93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7"}
Oct 07 13:04:34 crc kubenswrapper[4959]: I1007 13:04:34.705204 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:04:34 crc kubenswrapper[4959]: I1007 13:04:34.705263 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:04:35 crc kubenswrapper[4959]: I1007 13:04:35.133892 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:04:35 crc kubenswrapper[4959]: I1007 13:04:35.134114 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:04:35 crc kubenswrapper[4959]: I1007 13:04:35.368014 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:04:35 crc kubenswrapper[4959]: I1007 13:04:35.385838 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:04:36 crc kubenswrapper[4959]: I1007 13:04:36.502571 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:04:36 crc kubenswrapper[4959]: I1007 13:04:36.503413 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:04:36 crc kubenswrapper[4959]: I1007 13:04:36.565271 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:04:36 crc kubenswrapper[4959]: I1007 13:04:36.942453 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:04:36 crc kubenswrapper[4959]: I1007 13:04:36.943063 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.008294 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.506245 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.508616 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.695852 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.695912 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.695960 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.696368 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 13:04:37 crc kubenswrapper[4959]: I1007 13:04:37.696464 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f" gracePeriod=600
Oct 07 13:04:38 crc kubenswrapper[4959]: I1007 13:04:38.480788 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f" exitCode=0
Oct 07 13:04:38 crc kubenswrapper[4959]: I1007 13:04:38.480916 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f"}
Oct 07 13:04:39 crc kubenswrapper[4959]: I1007 13:04:39.815540 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvndj"]
Oct 07 13:04:39 crc kubenswrapper[4959]: I1007 13:04:39.816956 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lvndj" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="registry-server" containerID="cri-o://8fc9412578ff452ad024780a9063e0685e90f76b79eb5c4a27916a24bd7ece10" gracePeriod=2
Oct 07 13:04:40 crc kubenswrapper[4959]: I1007 13:04:40.492517 4959 generic.go:334] "Generic (PLEG): container finished" podID="b8fe2270-9af2-4587-8e8d-33696177645a" containerID="8fc9412578ff452ad024780a9063e0685e90f76b79eb5c4a27916a24bd7ece10" exitCode=0
Oct 07 13:04:40 crc kubenswrapper[4959]: I1007 13:04:40.492583 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerDied","Data":"8fc9412578ff452ad024780a9063e0685e90f76b79eb5c4a27916a24bd7ece10"}
Oct 07 13:04:40 crc kubenswrapper[4959]: I1007 13:04:40.495247 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhdcb" event={"ID":"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4","Type":"ContainerStarted","Data":"7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111"}
Oct 07 13:04:40 crc kubenswrapper[4959]: I1007 13:04:40.497162 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"cbc8950264097c3d06b2e882262edee6c191ec573bd905f62e8addd40b664809"}
Oct 07 13:04:40 crc kubenswrapper[4959]: I1007 13:04:40.499327 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfqft" event={"ID":"726f45c1-32a5-47f2-9540-d6ab00654be3","Type":"ContainerStarted","Data":"56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b"}
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.206421 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.223934 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qhdcb" podStartSLOduration=3.816760035 podStartE2EDuration="1m17.223912059s" podCreationTimestamp="2025-10-07 13:03:24 +0000 UTC" firstStartedPulling="2025-10-07 13:03:25.836821815 +0000 UTC m=+157.997544502" lastFinishedPulling="2025-10-07 13:04:39.243973859 +0000 UTC m=+231.404696526" observedRunningTime="2025-10-07 13:04:40.511912433 +0000 UTC m=+232.672635130" watchObservedRunningTime="2025-10-07 13:04:41.223912059 +0000 UTC m=+233.384634736"
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.358585 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-utilities\") pod \"b8fe2270-9af2-4587-8e8d-33696177645a\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") "
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.358646 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7c6\" (UniqueName: \"kubernetes.io/projected/b8fe2270-9af2-4587-8e8d-33696177645a-kube-api-access-gk7c6\") pod \"b8fe2270-9af2-4587-8e8d-33696177645a\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") "
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.358670 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-catalog-content\") pod \"b8fe2270-9af2-4587-8e8d-33696177645a\" (UID: \"b8fe2270-9af2-4587-8e8d-33696177645a\") "
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.363513 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-utilities" (OuterVolumeSpecName: "utilities") pod "b8fe2270-9af2-4587-8e8d-33696177645a" (UID: "b8fe2270-9af2-4587-8e8d-33696177645a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.373309 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fe2270-9af2-4587-8e8d-33696177645a-kube-api-access-gk7c6" (OuterVolumeSpecName: "kube-api-access-gk7c6") pod "b8fe2270-9af2-4587-8e8d-33696177645a" (UID: "b8fe2270-9af2-4587-8e8d-33696177645a"). InnerVolumeSpecName "kube-api-access-gk7c6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.379148 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8fe2270-9af2-4587-8e8d-33696177645a" (UID: "b8fe2270-9af2-4587-8e8d-33696177645a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.460138 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.460178 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7c6\" (UniqueName: \"kubernetes.io/projected/b8fe2270-9af2-4587-8e8d-33696177645a-kube-api-access-gk7c6\") on node \"crc\" DevicePath \"\""
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.460188 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8fe2270-9af2-4587-8e8d-33696177645a-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.505836 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvndj" event={"ID":"b8fe2270-9af2-4587-8e8d-33696177645a","Type":"ContainerDied","Data":"613322e16e3ff235de8201873741969faf5be3eff5c6fd71e1d578074ff73497"}
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.505891 4959 scope.go:117] "RemoveContainer" containerID="8fc9412578ff452ad024780a9063e0685e90f76b79eb5c4a27916a24bd7ece10"
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.505934 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvndj"
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.537296 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dfqft" podStartSLOduration=5.668087169 podStartE2EDuration="1m17.537267318s" podCreationTimestamp="2025-10-07 13:03:24 +0000 UTC" firstStartedPulling="2025-10-07 13:03:25.846894939 +0000 UTC m=+158.007617616" lastFinishedPulling="2025-10-07 13:04:37.716075088 +0000 UTC m=+229.876797765" observedRunningTime="2025-10-07 13:04:41.533660831 +0000 UTC m=+233.694383528" watchObservedRunningTime="2025-10-07 13:04:41.537267318 +0000 UTC m=+233.697990005"
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.546818 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvndj"]
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.553495 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvndj"]
Oct 07 13:04:41 crc kubenswrapper[4959]: I1007 13:04:41.949360 4959 scope.go:117] "RemoveContainer" containerID="e796324023038f6b7f297f7c7caef457891f0e5fc4d6c797e03fd092338433d4"
Oct 07 13:04:42 crc kubenswrapper[4959]: I1007 13:04:42.007451 4959 scope.go:117] "RemoveContainer" containerID="508cd8f9aecf950a1cc52e2b2ce2757d9386e9ee85cb513f6bab53ff4815fd28"
Oct 07 13:04:42 crc kubenswrapper[4959]: I1007 13:04:42.517315 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerStarted","Data":"59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19"}
Oct 07 13:04:42 crc kubenswrapper[4959]: I1007 13:04:42.521877 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9pzw" event={"ID":"c41be072-ce09-432f-8ea3-b1a0363a74df","Type":"ContainerStarted","Data":"fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9"}
Oct 07 13:04:42 crc kubenswrapper[4959]: I1007 13:04:42.545661 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4ch4j" podStartSLOduration=2.458238136 podStartE2EDuration="1m15.545617045s" podCreationTimestamp="2025-10-07 13:03:27 +0000 UTC" firstStartedPulling="2025-10-07 13:03:28.931469049 +0000 UTC m=+161.092191726" lastFinishedPulling="2025-10-07 13:04:42.018847948 +0000 UTC m=+234.179570635" observedRunningTime="2025-10-07 13:04:42.538088352 +0000 UTC m=+234.698811039" watchObservedRunningTime="2025-10-07 13:04:42.545617045 +0000 UTC m=+234.706339742"
Oct 07 13:04:42 crc kubenswrapper[4959]: I1007 13:04:42.562983 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f9pzw" podStartSLOduration=2.471037169 podStartE2EDuration="1m15.562956964s" podCreationTimestamp="2025-10-07 13:03:27 +0000 UTC" firstStartedPulling="2025-10-07 13:03:28.941908426 +0000 UTC m=+161.102631103" lastFinishedPulling="2025-10-07 13:04:42.033828211 +0000 UTC m=+234.194550898" observedRunningTime="2025-10-07 13:04:42.559355068 +0000 UTC m=+234.720077755" watchObservedRunningTime="2025-10-07 13:04:42.562956964 +0000 UTC m=+234.723679651"
Oct 07 13:04:42 crc kubenswrapper[4959]: I1007 13:04:42.819099 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" path="/var/lib/kubelet/pods/b8fe2270-9af2-4587-8e8d-33696177645a/volumes"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.521176 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dfqft"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.521263 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dfqft"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.570466 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dfqft"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.751033 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.917262 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.917577 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:04:44 crc kubenswrapper[4959]: I1007 13:04:44.970249 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:04:45 crc kubenswrapper[4959]: I1007 13:04:45.172737 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6bptn"
Oct 07 13:04:45 crc kubenswrapper[4959]: I1007 13:04:45.583897 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:04:47 crc kubenswrapper[4959]: I1007 13:04:47.741123 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f9pzw"
Oct 07 13:04:47 crc kubenswrapper[4959]: I1007 13:04:47.741666 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f9pzw"
Oct 07 13:04:47 crc kubenswrapper[4959]: I1007 13:04:47.789135 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f9pzw"
Oct 07 13:04:47 crc kubenswrapper[4959]: I1007 13:04:47.810871 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhdcb"]
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.112588 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4ch4j"
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.112701 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4ch4j"
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.157784 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4ch4j"
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.560475 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qhdcb" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="registry-server" containerID="cri-o://7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111" gracePeriod=2
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.598583 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4ch4j"
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.602152 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f9pzw"
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.817575 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bptn"]
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.818315 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6bptn" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="registry-server" containerID="cri-o://afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2" gracePeriod=2
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.977305 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhdcb"
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.997349 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-catalog-content\") pod \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") "
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.997549 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-utilities\") pod \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") "
Oct 07 13:04:48 crc kubenswrapper[4959]: I1007 13:04:48.997604 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dtp4\" (UniqueName: \"kubernetes.io/projected/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-kube-api-access-6dtp4\") pod \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\" (UID: \"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4\") "
Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:48.998948 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-utilities" (OuterVolumeSpecName: "utilities") pod "e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" (UID: "e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.044743 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-kube-api-access-6dtp4" (OuterVolumeSpecName: "kube-api-access-6dtp4") pod "e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" (UID: "e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4"). InnerVolumeSpecName "kube-api-access-6dtp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.053843 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" (UID: "e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.098635 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dtp4\" (UniqueName: \"kubernetes.io/projected/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-kube-api-access-6dtp4\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.098661 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.098674 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.230650 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6bptn" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.301680 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82rdc\" (UniqueName: \"kubernetes.io/projected/b31f5e5a-8410-4933-a446-515c58b1e7c3-kube-api-access-82rdc\") pod \"b31f5e5a-8410-4933-a446-515c58b1e7c3\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.301798 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-catalog-content\") pod \"b31f5e5a-8410-4933-a446-515c58b1e7c3\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.301866 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-utilities\") pod \"b31f5e5a-8410-4933-a446-515c58b1e7c3\" (UID: \"b31f5e5a-8410-4933-a446-515c58b1e7c3\") " Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.302931 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-utilities" (OuterVolumeSpecName: "utilities") pod "b31f5e5a-8410-4933-a446-515c58b1e7c3" (UID: "b31f5e5a-8410-4933-a446-515c58b1e7c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.305005 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31f5e5a-8410-4933-a446-515c58b1e7c3-kube-api-access-82rdc" (OuterVolumeSpecName: "kube-api-access-82rdc") pod "b31f5e5a-8410-4933-a446-515c58b1e7c3" (UID: "b31f5e5a-8410-4933-a446-515c58b1e7c3"). InnerVolumeSpecName "kube-api-access-82rdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.356032 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b31f5e5a-8410-4933-a446-515c58b1e7c3" (UID: "b31f5e5a-8410-4933-a446-515c58b1e7c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.403604 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82rdc\" (UniqueName: \"kubernetes.io/projected/b31f5e5a-8410-4933-a446-515c58b1e7c3-kube-api-access-82rdc\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.403798 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.403903 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31f5e5a-8410-4933-a446-515c58b1e7c3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.566756 4959 generic.go:334] "Generic (PLEG): container finished" podID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerID="7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111" exitCode=0 Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.566859 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhdcb" event={"ID":"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4","Type":"ContainerDied","Data":"7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111"} Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.566901 4959 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhdcb" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.567378 4959 scope.go:117] "RemoveContainer" containerID="7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.567291 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhdcb" event={"ID":"e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4","Type":"ContainerDied","Data":"6e131ac6ee423e445928b579fb30112717e6a08f6cefae69b9dba744a0635a28"} Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.569986 4959 generic.go:334] "Generic (PLEG): container finished" podID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerID="afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2" exitCode=0 Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.570048 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerDied","Data":"afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2"} Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.570122 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bptn" event={"ID":"b31f5e5a-8410-4933-a446-515c58b1e7c3","Type":"ContainerDied","Data":"5c5c9c5a5caa7a13dc28a99d147fb0453d4875690442c7333db90373e94194dc"} Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.570124 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6bptn" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.588229 4959 scope.go:117] "RemoveContainer" containerID="e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.616935 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bptn"] Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.627509 4959 scope.go:117] "RemoveContainer" containerID="d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.628576 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6bptn"] Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.633018 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhdcb"] Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.635397 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qhdcb"] Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.644936 4959 scope.go:117] "RemoveContainer" containerID="7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111" Oct 07 13:04:49 crc kubenswrapper[4959]: E1007 13:04:49.645427 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111\": container with ID starting with 7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111 not found: ID does not exist" containerID="7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.645519 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111"} err="failed to get container status \"7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111\": rpc error: code = NotFound desc = could not find container \"7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111\": container with ID starting with 7b3e1c3daf5074c32d52b229b714c0170d1183b43ebca37ee1227d1620f11111 not found: ID does not exist" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.645602 4959 scope.go:117] "RemoveContainer" containerID="e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5" Oct 07 13:04:49 crc kubenswrapper[4959]: E1007 13:04:49.646205 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5\": container with ID starting with e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5 not found: ID does not exist" containerID="e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.646261 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5"} err="failed to get container status \"e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5\": rpc error: code = NotFound desc = could not find container \"e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5\": container with ID starting with e7e1153e787b64a93e37c28742710ae480e0fc8939fc6541e02dc11761e86be5 not found: ID does not exist" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.646277 4959 scope.go:117] "RemoveContainer" containerID="d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f" Oct 07 13:04:49 crc kubenswrapper[4959]: E1007 13:04:49.646702 4959 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f\": container with ID starting with d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f not found: ID does not exist" containerID="d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.646815 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f"} err="failed to get container status \"d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f\": rpc error: code = NotFound desc = could not find container \"d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f\": container with ID starting with d35b079996338253e5406c1411c66bcf803c592f6c065c59752917c5f5d0273f not found: ID does not exist" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.646924 4959 scope.go:117] "RemoveContainer" containerID="afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.659577 4959 scope.go:117] "RemoveContainer" containerID="fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.675408 4959 scope.go:117] "RemoveContainer" containerID="bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.701585 4959 scope.go:117] "RemoveContainer" containerID="afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2" Oct 07 13:04:49 crc kubenswrapper[4959]: E1007 13:04:49.702488 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2\": container with ID starting with 
afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2 not found: ID does not exist" containerID="afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.702525 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2"} err="failed to get container status \"afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2\": rpc error: code = NotFound desc = could not find container \"afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2\": container with ID starting with afe769799fc954e0912bb59c452c49fa204e3a15ab76b684ffdb9b9bf24f6fc2 not found: ID does not exist" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.702555 4959 scope.go:117] "RemoveContainer" containerID="fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb" Oct 07 13:04:49 crc kubenswrapper[4959]: E1007 13:04:49.702934 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb\": container with ID starting with fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb not found: ID does not exist" containerID="fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.702961 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb"} err="failed to get container status \"fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb\": rpc error: code = NotFound desc = could not find container \"fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb\": container with ID starting with fa7db0dae317fc5fcd7c4f8ec58d6c82532e6425f1d93270602418a9fdfebbeb not found: ID does not 
exist" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.702977 4959 scope.go:117] "RemoveContainer" containerID="bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb" Oct 07 13:04:49 crc kubenswrapper[4959]: E1007 13:04:49.703239 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb\": container with ID starting with bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb not found: ID does not exist" containerID="bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb" Oct 07 13:04:49 crc kubenswrapper[4959]: I1007 13:04:49.703264 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb"} err="failed to get container status \"bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb\": rpc error: code = NotFound desc = could not find container \"bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb\": container with ID starting with bc8e7eb5ecbbb14d0b27500ec72e9670ee21a51f06caa774e5c20041c84a8ceb not found: ID does not exist" Oct 07 13:04:50 crc kubenswrapper[4959]: I1007 13:04:50.819645 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" path="/var/lib/kubelet/pods/b31f5e5a-8410-4933-a446-515c58b1e7c3/volumes" Oct 07 13:04:50 crc kubenswrapper[4959]: I1007 13:04:50.820551 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" path="/var/lib/kubelet/pods/e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4/volumes" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.210506 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4ch4j"] Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.210735 4959 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4ch4j" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="registry-server" containerID="cri-o://59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19" gracePeriod=2 Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.574254 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.586967 4959 generic.go:334] "Generic (PLEG): container finished" podID="68d5824e-afe0-4c5d-99b0-485644e77905" containerID="59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19" exitCode=0 Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.587005 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerDied","Data":"59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19"} Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.587047 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ch4j" event={"ID":"68d5824e-afe0-4c5d-99b0-485644e77905","Type":"ContainerDied","Data":"e645779b47b1fa05076e06676938f9ccec15604489de07f61b618c0ed971382f"} Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.587064 4959 scope.go:117] "RemoveContainer" containerID="59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.587154 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4ch4j" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.607278 4959 scope.go:117] "RemoveContainer" containerID="93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.624674 4959 scope.go:117] "RemoveContainer" containerID="19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.641499 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-catalog-content\") pod \"68d5824e-afe0-4c5d-99b0-485644e77905\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.641540 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xzph\" (UniqueName: \"kubernetes.io/projected/68d5824e-afe0-4c5d-99b0-485644e77905-kube-api-access-7xzph\") pod \"68d5824e-afe0-4c5d-99b0-485644e77905\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.641575 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-utilities\") pod \"68d5824e-afe0-4c5d-99b0-485644e77905\" (UID: \"68d5824e-afe0-4c5d-99b0-485644e77905\") " Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.641933 4959 scope.go:117] "RemoveContainer" containerID="59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.642316 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-utilities" (OuterVolumeSpecName: "utilities") pod "68d5824e-afe0-4c5d-99b0-485644e77905" (UID: 
"68d5824e-afe0-4c5d-99b0-485644e77905"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:04:51 crc kubenswrapper[4959]: E1007 13:04:51.642585 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19\": container with ID starting with 59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19 not found: ID does not exist" containerID="59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.642618 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19"} err="failed to get container status \"59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19\": rpc error: code = NotFound desc = could not find container \"59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19\": container with ID starting with 59893c75e34c7a917868b3ebe34742756172da9d9ec8102f7f7c4d34e63e8d19 not found: ID does not exist" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.642664 4959 scope.go:117] "RemoveContainer" containerID="93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7" Oct 07 13:04:51 crc kubenswrapper[4959]: E1007 13:04:51.643155 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7\": container with ID starting with 93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7 not found: ID does not exist" containerID="93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.643176 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7"} err="failed to get container status \"93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7\": rpc error: code = NotFound desc = could not find container \"93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7\": container with ID starting with 93ec9f5dde622367e6851517353585153ea664a15b21de0632e3df31b04f93e7 not found: ID does not exist" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.643192 4959 scope.go:117] "RemoveContainer" containerID="19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2" Oct 07 13:04:51 crc kubenswrapper[4959]: E1007 13:04:51.643455 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2\": container with ID starting with 19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2 not found: ID does not exist" containerID="19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.643479 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2"} err="failed to get container status \"19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2\": rpc error: code = NotFound desc = could not find container \"19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2\": container with ID starting with 19a0c8fea6516162419cea432b9ad94fbe3ef055df3c44130091b4aeca7d8ab2 not found: ID does not exist" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.649943 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d5824e-afe0-4c5d-99b0-485644e77905-kube-api-access-7xzph" (OuterVolumeSpecName: "kube-api-access-7xzph") pod 
"68d5824e-afe0-4c5d-99b0-485644e77905" (UID: "68d5824e-afe0-4c5d-99b0-485644e77905"). InnerVolumeSpecName "kube-api-access-7xzph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.728222 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68d5824e-afe0-4c5d-99b0-485644e77905" (UID: "68d5824e-afe0-4c5d-99b0-485644e77905"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.743922 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.744200 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xzph\" (UniqueName: \"kubernetes.io/projected/68d5824e-afe0-4c5d-99b0-485644e77905-kube-api-access-7xzph\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.744219 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d5824e-afe0-4c5d-99b0-485644e77905-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.919848 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4ch4j"] Oct 07 13:04:51 crc kubenswrapper[4959]: I1007 13:04:51.923194 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4ch4j"] Oct 07 13:04:52 crc kubenswrapper[4959]: I1007 13:04:52.816182 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" 
path="/var/lib/kubelet/pods/68d5824e-afe0-4c5d-99b0-485644e77905/volumes"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507431 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m7kr6"]
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507778 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507794 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507803 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507815 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507826 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09377877-6bb1-428f-8957-6178c779bfb8" containerName="pruner"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507835 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="09377877-6bb1-428f-8957-6178c779bfb8" containerName="pruner"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507847 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507853 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507860 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507866 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507877 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507883 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507892 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507898 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="extract-utilities"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507911 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507916 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507925 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507930 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507940 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507948 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507959 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507965 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507977 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.507983 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="extract-content"
Oct 07 13:04:53 crc kubenswrapper[4959]: E1007 13:04:53.507995 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508001 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508107 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="09377877-6bb1-428f-8957-6178c779bfb8" containerName="pruner"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508118 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7cf0a35-1af6-4c0c-b8e0-f7a0727071d4" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508126 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d5824e-afe0-4c5d-99b0-485644e77905" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508142 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8fe2270-9af2-4587-8e8d-33696177645a" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508157 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31f5e5a-8410-4933-a446-515c58b1e7c3" containerName="registry-server"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.508677 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.523898 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m7kr6"]
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569313 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4dz\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-kube-api-access-9z4dz\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569361 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569435 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569499 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-registry-certificates\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569528 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-trusted-ca\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569583 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569799 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-bound-sa-token\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.569849 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-registry-tls\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.599309 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671452 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-bound-sa-token\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671495 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-registry-tls\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671538 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4dz\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-kube-api-access-9z4dz\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671560 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671594 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671642 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-registry-certificates\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.671673 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-trusted-ca\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.672578 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.673209 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-registry-certificates\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.673575 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-trusted-ca\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.679657 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.690339 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-registry-tls\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.691258 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-bound-sa-token\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.694317 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4dz\" (UniqueName: \"kubernetes.io/projected/b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648-kube-api-access-9z4dz\") pod \"image-registry-66df7c8f76-m7kr6\" (UID: \"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648\") " pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:53 crc kubenswrapper[4959]: I1007 13:04:53.824153 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.252224 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m7kr6"]
Oct 07 13:04:54 crc kubenswrapper[4959]: W1007 13:04:54.257717 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e9cdbc_4d7c_4cd5_9aab_386dbeb12648.slice/crio-a43f9a321b05123c3468fb5cd10d71bd27fa9a4e66ec0bd90092473670367bba WatchSource:0}: Error finding container a43f9a321b05123c3468fb5cd10d71bd27fa9a4e66ec0bd90092473670367bba: Status 404 returned error can't find the container with id a43f9a321b05123c3468fb5cd10d71bd27fa9a4e66ec0bd90092473670367bba
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.566601 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dfqft"
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.610194 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6" event={"ID":"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648","Type":"ContainerStarted","Data":"73291e22e923bb29737146e152c6030a128d4b7b7faf03fc703967a572e247f3"}
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.610235 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6" event={"ID":"b6e9cdbc-4d7c-4cd5-9aab-386dbeb12648","Type":"ContainerStarted","Data":"a43f9a321b05123c3468fb5cd10d71bd27fa9a4e66ec0bd90092473670367bba"}
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.610324 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.633063 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6" podStartSLOduration=1.633047559 podStartE2EDuration="1.633047559s" podCreationTimestamp="2025-10-07 13:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:04:54.630673993 +0000 UTC m=+246.791396690" watchObservedRunningTime="2025-10-07 13:04:54.633047559 +0000 UTC m=+246.793770226"
Oct 07 13:04:54 crc kubenswrapper[4959]: I1007 13:04:54.858409 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gct22"]
Oct 07 13:05:13 crc kubenswrapper[4959]: I1007 13:05:13.832645 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-m7kr6"
Oct 07 13:05:13 crc kubenswrapper[4959]: I1007 13:05:13.927211 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4wfvg"]
Oct 07 13:05:19 crc kubenswrapper[4959]: I1007 13:05:19.892299 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerName="oauth-openshift" containerID="cri-o://fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c" gracePeriod=15
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.324982 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gct22"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.353153 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"]
Oct 07 13:05:20 crc kubenswrapper[4959]: E1007 13:05:20.353479 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerName="oauth-openshift"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.353497 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerName="oauth-openshift"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.353666 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerName="oauth-openshift"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.354268 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.377973 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"]
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.510078 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x968\" (UniqueName: \"kubernetes.io/projected/5ed6f47e-1445-40fb-a469-690dc49e5974-kube-api-access-6x968\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.510132 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-login\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.510223 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-router-certs\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.510244 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-service-ca\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511044 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-session\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511072 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-provider-selection\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511095 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-idp-0-file-data\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511116 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-serving-cert\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511160 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-trusted-ca-bundle\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511258 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-cliconfig\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511275 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-ocp-branding-template\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511306 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-error\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511335 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-policies\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511373 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-dir\") pod \"5ed6f47e-1445-40fb-a469-690dc49e5974\" (UID: \"5ed6f47e-1445-40fb-a469-690dc49e5974\") "
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511513 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511555 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsh8d\" (UniqueName: \"kubernetes.io/projected/2ea06d03-c89b-4609-b88f-b13df23ac616-kube-api-access-jsh8d\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511594 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ea06d03-c89b-4609-b88f-b13df23ac616-audit-dir\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511647 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511672 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-login\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511689 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511720 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511740 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511758 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-session\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511780 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511798 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-audit-policies\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511820 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-error\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511864 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511886 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511908 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.511941 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.512296 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.512363 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.512789 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.513144 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.515929 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.516188 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed6f47e-1445-40fb-a469-690dc49e5974-kube-api-access-6x968" (OuterVolumeSpecName: "kube-api-access-6x968") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "kube-api-access-6x968". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.516445 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.516438 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.516696 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.517367 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-router-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.517529 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.517838 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.518170 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5ed6f47e-1445-40fb-a469-690dc49e5974" (UID: "5ed6f47e-1445-40fb-a469-690dc49e5974"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613383 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ea06d03-c89b-4609-b88f-b13df23ac616-audit-dir\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613440 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613462 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-login\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613484 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613516 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613504 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ea06d03-c89b-4609-b88f-b13df23ac616-audit-dir\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613536 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613667 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-session\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613775 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " 
pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613817 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-audit-policies\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613907 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-error\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613971 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.613997 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614024 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614112 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsh8d\" (UniqueName: \"kubernetes.io/projected/2ea06d03-c89b-4609-b88f-b13df23ac616-kube-api-access-jsh8d\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614192 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614210 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614225 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614239 4959 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614255 4959 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/5ed6f47e-1445-40fb-a469-690dc49e5974-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614270 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x968\" (UniqueName: \"kubernetes.io/projected/5ed6f47e-1445-40fb-a469-690dc49e5974-kube-api-access-6x968\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614284 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614299 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614324 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614340 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614356 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614369 4959 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614383 4959 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed6f47e-1445-40fb-a469-690dc49e5974-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.614448 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.615804 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.617016 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-audit-policies\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.617486 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-error\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.618093 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.618402 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.618732 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-login\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.620045 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " 
pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.620663 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.620806 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.620939 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-session\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.621316 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ea06d03-c89b-4609-b88f-b13df23ac616-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.635902 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsh8d\" (UniqueName: 
\"kubernetes.io/projected/2ea06d03-c89b-4609-b88f-b13df23ac616-kube-api-access-jsh8d\") pod \"oauth-openshift-69bcbbd7f8-69c6l\" (UID: \"2ea06d03-c89b-4609-b88f-b13df23ac616\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.674546 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.770413 4959 generic.go:334] "Generic (PLEG): container finished" podID="5ed6f47e-1445-40fb-a469-690dc49e5974" containerID="fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c" exitCode=0 Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.770460 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" event={"ID":"5ed6f47e-1445-40fb-a469-690dc49e5974","Type":"ContainerDied","Data":"fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c"} Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.770486 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" event={"ID":"5ed6f47e-1445-40fb-a469-690dc49e5974","Type":"ContainerDied","Data":"ddbbbbb319785389f2fa4b757f8793f8fa45bc6ea189eb54295b9917597c4639"} Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.770502 4959 scope.go:117] "RemoveContainer" containerID="fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.770514 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gct22" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.813412 4959 scope.go:117] "RemoveContainer" containerID="fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c" Oct 07 13:05:20 crc kubenswrapper[4959]: E1007 13:05:20.816436 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c\": container with ID starting with fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c not found: ID does not exist" containerID="fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.816537 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c"} err="failed to get container status \"fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c\": rpc error: code = NotFound desc = could not find container \"fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c\": container with ID starting with fe5561c5a36d66eb498421839be898bd58fb14d561ac4716adf6b6ae10b5a72c not found: ID does not exist" Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.831389 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gct22"] Oct 07 13:05:20 crc kubenswrapper[4959]: I1007 13:05:20.831440 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gct22"] Oct 07 13:05:21 crc kubenswrapper[4959]: I1007 13:05:21.126292 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l"] Oct 07 13:05:21 crc kubenswrapper[4959]: I1007 13:05:21.779735 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" event={"ID":"2ea06d03-c89b-4609-b88f-b13df23ac616","Type":"ContainerStarted","Data":"ae7176a4ff7046389c4218f40295af7bace7bdde9dcd441ac4ba18cad0fac28d"} Oct 07 13:05:21 crc kubenswrapper[4959]: I1007 13:05:21.780303 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:21 crc kubenswrapper[4959]: I1007 13:05:21.780338 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" event={"ID":"2ea06d03-c89b-4609-b88f-b13df23ac616","Type":"ContainerStarted","Data":"d4e184d5a484a32578b76054425ec96af7236eabad8f95518c2b4d5c920848d3"} Oct 07 13:05:21 crc kubenswrapper[4959]: I1007 13:05:21.822525 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" podStartSLOduration=27.822496955 podStartE2EDuration="27.822496955s" podCreationTimestamp="2025-10-07 13:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:05:21.815296872 +0000 UTC m=+273.976019600" watchObservedRunningTime="2025-10-07 13:05:21.822496955 +0000 UTC m=+273.983219642" Oct 07 13:05:22 crc kubenswrapper[4959]: I1007 13:05:22.212857 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-69c6l" Oct 07 13:05:22 crc kubenswrapper[4959]: I1007 13:05:22.819968 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed6f47e-1445-40fb-a469-690dc49e5974" path="/var/lib/kubelet/pods/5ed6f47e-1445-40fb-a469-690dc49e5974/volumes" Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.020907 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vtzh"] Oct 07 13:05:34 crc kubenswrapper[4959]: 
I1007 13:05:34.022006 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8vtzh" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="registry-server" containerID="cri-o://b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4" gracePeriod=30 Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.030041 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfqft"] Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.030396 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dfqft" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="registry-server" containerID="cri-o://56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b" gracePeriod=30 Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.046930 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z65mr"] Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.047271 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" podUID="85df292a-1000-48f0-be15-823ada38a57b" containerName="marketplace-operator" containerID="cri-o://3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31" gracePeriod=30 Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.059957 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s57zz"] Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.060358 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s57zz" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="registry-server" containerID="cri-o://308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab" gracePeriod=30 Oct 07 
13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.066474 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9pzw"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.067670 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f9pzw" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="registry-server" containerID="cri-o://fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9" gracePeriod=30
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.082540 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbtxf"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.083381 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.109823 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbtxf"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.150133 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0548f538-781a-406b-8d2c-4449281cc77c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.150172 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z6kw\" (UniqueName: \"kubernetes.io/projected/0548f538-781a-406b-8d2c-4449281cc77c-kube-api-access-8z6kw\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.150237 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0548f538-781a-406b-8d2c-4449281cc77c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.251737 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0548f538-781a-406b-8d2c-4449281cc77c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.251829 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0548f538-781a-406b-8d2c-4449281cc77c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.251856 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z6kw\" (UniqueName: \"kubernetes.io/projected/0548f538-781a-406b-8d2c-4449281cc77c-kube-api-access-8z6kw\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.253711 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0548f538-781a-406b-8d2c-4449281cc77c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.261788 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0548f538-781a-406b-8d2c-4449281cc77c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.269772 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z6kw\" (UniqueName: \"kubernetes.io/projected/0548f538-781a-406b-8d2c-4449281cc77c-kube-api-access-8z6kw\") pod \"marketplace-operator-79b997595-fbtxf\" (UID: \"0548f538-781a-406b-8d2c-4449281cc77c\") " pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.473813 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.481863 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.523836 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b is running failed: container process not found" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b" cmd=["grpc_health_probe","-addr=:50051"]
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.524361 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b is running failed: container process not found" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b" cmd=["grpc_health_probe","-addr=:50051"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.527208 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.527448 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b is running failed: container process not found" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b" cmd=["grpc_health_probe","-addr=:50051"]
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.527479 4959 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-dfqft" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="registry-server"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.542972 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.578139 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfqft"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.593066 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9pzw"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703187 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-catalog-content\") pod \"04b359b0-6002-4de2-912e-e5068f8fe8fe\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703259 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics\") pod \"85df292a-1000-48f0-be15-823ada38a57b\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703292 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-utilities\") pod \"cd5b4f5b-d699-457d-8363-def61c21613b\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703315 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvjth\" (UniqueName: \"kubernetes.io/projected/04b359b0-6002-4de2-912e-e5068f8fe8fe-kube-api-access-gvjth\") pod \"04b359b0-6002-4de2-912e-e5068f8fe8fe\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703349 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz2nh\" (UniqueName: \"kubernetes.io/projected/726f45c1-32a5-47f2-9540-d6ab00654be3-kube-api-access-mz2nh\") pod \"726f45c1-32a5-47f2-9540-d6ab00654be3\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703377 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx2ld\" (UniqueName: \"kubernetes.io/projected/c41be072-ce09-432f-8ea3-b1a0363a74df-kube-api-access-qx2ld\") pod \"c41be072-ce09-432f-8ea3-b1a0363a74df\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703405 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca\") pod \"85df292a-1000-48f0-be15-823ada38a57b\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703433 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds8hv\" (UniqueName: \"kubernetes.io/projected/85df292a-1000-48f0-be15-823ada38a57b-kube-api-access-ds8hv\") pod \"85df292a-1000-48f0-be15-823ada38a57b\" (UID: \"85df292a-1000-48f0-be15-823ada38a57b\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703461 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-utilities\") pod \"726f45c1-32a5-47f2-9540-d6ab00654be3\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703484 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnjgk\" (UniqueName: \"kubernetes.io/projected/cd5b4f5b-d699-457d-8363-def61c21613b-kube-api-access-lnjgk\") pod \"cd5b4f5b-d699-457d-8363-def61c21613b\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703526 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-catalog-content\") pod \"c41be072-ce09-432f-8ea3-b1a0363a74df\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703559 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-catalog-content\") pod \"cd5b4f5b-d699-457d-8363-def61c21613b\" (UID: \"cd5b4f5b-d699-457d-8363-def61c21613b\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703586 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-utilities\") pod \"c41be072-ce09-432f-8ea3-b1a0363a74df\" (UID: \"c41be072-ce09-432f-8ea3-b1a0363a74df\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703609 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-utilities\") pod \"04b359b0-6002-4de2-912e-e5068f8fe8fe\" (UID: \"04b359b0-6002-4de2-912e-e5068f8fe8fe\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.703692 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-catalog-content\") pod \"726f45c1-32a5-47f2-9540-d6ab00654be3\" (UID: \"726f45c1-32a5-47f2-9540-d6ab00654be3\") "
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.705365 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-utilities" (OuterVolumeSpecName: "utilities") pod "c41be072-ce09-432f-8ea3-b1a0363a74df" (UID: "c41be072-ce09-432f-8ea3-b1a0363a74df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.705487 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-utilities" (OuterVolumeSpecName: "utilities") pod "04b359b0-6002-4de2-912e-e5068f8fe8fe" (UID: "04b359b0-6002-4de2-912e-e5068f8fe8fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.707576 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-utilities" (OuterVolumeSpecName: "utilities") pod "cd5b4f5b-d699-457d-8363-def61c21613b" (UID: "cd5b4f5b-d699-457d-8363-def61c21613b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.708008 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-utilities" (OuterVolumeSpecName: "utilities") pod "726f45c1-32a5-47f2-9540-d6ab00654be3" (UID: "726f45c1-32a5-47f2-9540-d6ab00654be3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.709204 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "85df292a-1000-48f0-be15-823ada38a57b" (UID: "85df292a-1000-48f0-be15-823ada38a57b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.716221 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b359b0-6002-4de2-912e-e5068f8fe8fe-kube-api-access-gvjth" (OuterVolumeSpecName: "kube-api-access-gvjth") pod "04b359b0-6002-4de2-912e-e5068f8fe8fe" (UID: "04b359b0-6002-4de2-912e-e5068f8fe8fe"). InnerVolumeSpecName "kube-api-access-gvjth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.716251 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41be072-ce09-432f-8ea3-b1a0363a74df-kube-api-access-qx2ld" (OuterVolumeSpecName: "kube-api-access-qx2ld") pod "c41be072-ce09-432f-8ea3-b1a0363a74df" (UID: "c41be072-ce09-432f-8ea3-b1a0363a74df"). InnerVolumeSpecName "kube-api-access-qx2ld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.716263 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85df292a-1000-48f0-be15-823ada38a57b-kube-api-access-ds8hv" (OuterVolumeSpecName: "kube-api-access-ds8hv") pod "85df292a-1000-48f0-be15-823ada38a57b" (UID: "85df292a-1000-48f0-be15-823ada38a57b"). InnerVolumeSpecName "kube-api-access-ds8hv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.716316 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726f45c1-32a5-47f2-9540-d6ab00654be3-kube-api-access-mz2nh" (OuterVolumeSpecName: "kube-api-access-mz2nh") pod "726f45c1-32a5-47f2-9540-d6ab00654be3" (UID: "726f45c1-32a5-47f2-9540-d6ab00654be3"). InnerVolumeSpecName "kube-api-access-mz2nh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.716299 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "85df292a-1000-48f0-be15-823ada38a57b" (UID: "85df292a-1000-48f0-be15-823ada38a57b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.717969 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5b4f5b-d699-457d-8363-def61c21613b-kube-api-access-lnjgk" (OuterVolumeSpecName: "kube-api-access-lnjgk") pod "cd5b4f5b-d699-457d-8363-def61c21613b" (UID: "cd5b4f5b-d699-457d-8363-def61c21613b"). InnerVolumeSpecName "kube-api-access-lnjgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.722548 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd5b4f5b-d699-457d-8363-def61c21613b" (UID: "cd5b4f5b-d699-457d-8363-def61c21613b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.768513 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04b359b0-6002-4de2-912e-e5068f8fe8fe" (UID: "04b359b0-6002-4de2-912e-e5068f8fe8fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.768517 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "726f45c1-32a5-47f2-9540-d6ab00654be3" (UID: "726f45c1-32a5-47f2-9540-d6ab00654be3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.803868 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c41be072-ce09-432f-8ea3-b1a0363a74df" (UID: "c41be072-ce09-432f-8ea3-b1a0363a74df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805284 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805313 4959 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85df292a-1000-48f0-be15-823ada38a57b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805330 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805361 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvjth\" (UniqueName: \"kubernetes.io/projected/04b359b0-6002-4de2-912e-e5068f8fe8fe-kube-api-access-gvjth\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805370 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz2nh\" (UniqueName: \"kubernetes.io/projected/726f45c1-32a5-47f2-9540-d6ab00654be3-kube-api-access-mz2nh\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805380 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx2ld\" (UniqueName: \"kubernetes.io/projected/c41be072-ce09-432f-8ea3-b1a0363a74df-kube-api-access-qx2ld\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805389 4959 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85df292a-1000-48f0-be15-823ada38a57b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805398 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds8hv\" (UniqueName: \"kubernetes.io/projected/85df292a-1000-48f0-be15-823ada38a57b-kube-api-access-ds8hv\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805408 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805436 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnjgk\" (UniqueName: \"kubernetes.io/projected/cd5b4f5b-d699-457d-8363-def61c21613b-kube-api-access-lnjgk\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805447 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805455 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5b4f5b-d699-457d-8363-def61c21613b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805464 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c41be072-ce09-432f-8ea3-b1a0363a74df-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805472 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b359b0-6002-4de2-912e-e5068f8fe8fe-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.805480 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726f45c1-32a5-47f2-9540-d6ab00654be3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.869900 4959 generic.go:334] "Generic (PLEG): container finished" podID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b" exitCode=0
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.869965 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfqft" event={"ID":"726f45c1-32a5-47f2-9540-d6ab00654be3","Type":"ContainerDied","Data":"56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.869995 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfqft" event={"ID":"726f45c1-32a5-47f2-9540-d6ab00654be3","Type":"ContainerDied","Data":"126e196d639536ed1a2238822e0fdfe5a696ae17ce18ad18fb4f36ecbb5458ec"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.870013 4959 scope.go:117] "RemoveContainer" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.870008 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfqft"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.872516 4959 generic.go:334] "Generic (PLEG): container finished" podID="85df292a-1000-48f0-be15-823ada38a57b" containerID="3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31" exitCode=0
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.872596 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" event={"ID":"85df292a-1000-48f0-be15-823ada38a57b","Type":"ContainerDied","Data":"3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.872648 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr" event={"ID":"85df292a-1000-48f0-be15-823ada38a57b","Type":"ContainerDied","Data":"66040e68dddc6727095941060674080531c4f1b75456c420e9a99776e95582e4"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.872730 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z65mr"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.876215 4959 generic.go:334] "Generic (PLEG): container finished" podID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerID="fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9" exitCode=0
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.876267 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9pzw" event={"ID":"c41be072-ce09-432f-8ea3-b1a0363a74df","Type":"ContainerDied","Data":"fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.876293 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9pzw" event={"ID":"c41be072-ce09-432f-8ea3-b1a0363a74df","Type":"ContainerDied","Data":"a9ef56416378f0c229e3d48532f7c699834cef148761cea250fcff76abde8458"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.876356 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9pzw"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.880198 4959 generic.go:334] "Generic (PLEG): container finished" podID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerID="b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4" exitCode=0
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.880284 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vtzh" event={"ID":"04b359b0-6002-4de2-912e-e5068f8fe8fe","Type":"ContainerDied","Data":"b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.880326 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vtzh" event={"ID":"04b359b0-6002-4de2-912e-e5068f8fe8fe","Type":"ContainerDied","Data":"3380c178882e341fdf633f216bdcb46a1b9213707cad855e57381294395d3272"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.880325 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vtzh"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.886467 4959 generic.go:334] "Generic (PLEG): container finished" podID="cd5b4f5b-d699-457d-8363-def61c21613b" containerID="308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab" exitCode=0
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.886515 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s57zz" event={"ID":"cd5b4f5b-d699-457d-8363-def61c21613b","Type":"ContainerDied","Data":"308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.886552 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s57zz" event={"ID":"cd5b4f5b-d699-457d-8363-def61c21613b","Type":"ContainerDied","Data":"5a0317a3bb02c0c9f79a0ba2891f61c4eb8563ee7686885e011b0603765c7ea6"}
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.886620 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s57zz"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.892418 4959 scope.go:117] "RemoveContainer" containerID="e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.903014 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vtzh"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.911664 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8vtzh"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.932968 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9pzw"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.934964 4959 scope.go:117] "RemoveContainer" containerID="cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.941979 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f9pzw"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.952618 4959 scope.go:117] "RemoveContainer" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b"
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.954940 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b\": container with ID starting with 56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b not found: ID does not exist" containerID="56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.955016 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b"} err="failed to get container status \"56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b\": rpc error: code = NotFound desc = could not find container \"56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b\": container with ID starting with 56b5eb4ab2bbc2bd6c2581b0a4babfe585bbcd9e32ea1e583686d27fa0f3d39b not found: ID does not exist"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.955077 4959 scope.go:117] "RemoveContainer" containerID="e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.955977 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fbtxf"]
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.957407 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5\": container with ID starting with e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5 not found: ID does not exist" containerID="e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.957484 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5"} err="failed to get container status \"e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5\": rpc error: code = NotFound desc = could not find container \"e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5\": container with ID starting with e2968ae937a25b6f6e008047562b97245b1ffcb629d24ffd8fed7e5fba2da0f5 not found: ID does not exist"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.957521 4959 scope.go:117] "RemoveContainer" containerID="cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0"
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.958255 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0\": container with ID starting with cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0 not found: ID does not exist" containerID="cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.958284 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0"} err="failed to get container status \"cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0\": rpc error: code = NotFound desc = could not find container \"cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0\": container with ID starting with cdc6c9649fb701de3d3ce5d8efb424e81491eb98b531b57614f5a38627e9bfe0 not found: ID does not exist"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.958300 4959 scope.go:117] "RemoveContainer" containerID="3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.966242 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z65mr"]
Oct 07 13:05:34 crc kubenswrapper[4959]: W1007 13:05:34.966704 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0548f538_781a_406b_8d2c_4449281cc77c.slice/crio-d1d1454b0d9a307bc63b9674becf195484458cb4e4fea96c786e8d6c8f79d028 WatchSource:0}: Error finding container d1d1454b0d9a307bc63b9674becf195484458cb4e4fea96c786e8d6c8f79d028: Status 404 returned error can't find the container with id d1d1454b0d9a307bc63b9674becf195484458cb4e4fea96c786e8d6c8f79d028
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.968934 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z65mr"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.972483 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfqft"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.978417 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dfqft"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.987295 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s57zz"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.989353 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s57zz"]
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.991512 4959 scope.go:117] "RemoveContainer" containerID="3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31"
Oct 07 13:05:34 crc kubenswrapper[4959]: E1007 13:05:34.992374 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31\": container with ID starting with 3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31 not found: ID does not exist" containerID="3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.992410 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31"} err="failed to get container status \"3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31\": rpc error: code = NotFound desc = could not find container \"3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31\": container with ID starting with 3d409b7fc671d8a8fb58d7f9d09949d07cf2b827010e386cbc81e6a3c816ef31 not found: ID does not exist"
Oct 07 13:05:34 crc kubenswrapper[4959]: I1007 13:05:34.992438 4959 scope.go:117] "RemoveContainer" containerID="fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.011478 4959 scope.go:117] "RemoveContainer" containerID="110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.037737 4959 scope.go:117] "RemoveContainer" containerID="aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.075850 4959 scope.go:117] "RemoveContainer" containerID="fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9"
Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.079680 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9\": container with ID starting with fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9 not found: ID does not exist" containerID="fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.079727 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9"} err="failed to get container status \"fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9\": rpc error: code = NotFound desc = could not find container \"fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9\": container with ID starting with fcf00dd05383fd7abacb5530ee78a27c2fc1672f59a4435f448aa7891a58d4e9 not found: ID does not exist"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.079762 4959 scope.go:117] "RemoveContainer" containerID="110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92"
Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.080613 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92\": container with ID starting with 110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92 not found: ID does not exist" containerID="110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.080655 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92"} err="failed to get container status \"110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92\": rpc error: code = NotFound desc = could not find container \"110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92\": container with ID starting with 110f4a874f8ba9e9b11b40e26d1a11aa3f2d30f92e5b4fda6c2eb0a4fe4e9d92 not found: ID does not exist"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.080668 4959 scope.go:117] "RemoveContainer" containerID="aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4"
Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.080945 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4\": container with ID starting with aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4 not found: ID does not exist" containerID="aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4"
Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.080975 4959 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4"} err="failed to get container status \"aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4\": rpc error: code = NotFound desc = could not find container \"aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4\": container with ID starting with aefa28fe448ab6f1eb7ccb7e3675899c16fb730e4de74e448e27a629b76f1ae4 not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.080992 4959 scope.go:117] "RemoveContainer" containerID="b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.103432 4959 scope.go:117] "RemoveContainer" containerID="7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.119993 4959 scope.go:117] "RemoveContainer" containerID="a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.145649 4959 scope.go:117] "RemoveContainer" containerID="b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4" Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.148733 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4\": container with ID starting with b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4 not found: ID does not exist" containerID="b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.148814 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4"} err="failed to get container status \"b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4\": rpc error: code = 
NotFound desc = could not find container \"b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4\": container with ID starting with b8f2e0261895006b0aad120f6f44936f31b0a6826fcaaadd7d5d53b89ac82bc4 not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.148849 4959 scope.go:117] "RemoveContainer" containerID="7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16" Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.149473 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16\": container with ID starting with 7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16 not found: ID does not exist" containerID="7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.149505 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16"} err="failed to get container status \"7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16\": rpc error: code = NotFound desc = could not find container \"7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16\": container with ID starting with 7f2d4cc486c247c0779b1978aebf236f6cec556e325d162aee9a99048fd92f16 not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.149524 4959 scope.go:117] "RemoveContainer" containerID="a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f" Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.149888 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f\": container with ID starting with 
a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f not found: ID does not exist" containerID="a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.149941 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f"} err="failed to get container status \"a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f\": rpc error: code = NotFound desc = could not find container \"a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f\": container with ID starting with a7a92873e9e7d260eb3cd6575ef88409b59b564fc3b11f9cbb783a8225f2204f not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.149964 4959 scope.go:117] "RemoveContainer" containerID="308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.164767 4959 scope.go:117] "RemoveContainer" containerID="9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.244612 4959 scope.go:117] "RemoveContainer" containerID="002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.258522 4959 scope.go:117] "RemoveContainer" containerID="308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab" Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.259087 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab\": container with ID starting with 308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab not found: ID does not exist" containerID="308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 
13:05:35.259125 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab"} err="failed to get container status \"308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab\": rpc error: code = NotFound desc = could not find container \"308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab\": container with ID starting with 308a1eae4bfeb3dd0e52f1a38dd52e9ce76d148b0185aa03630d6c6e735488ab not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.259154 4959 scope.go:117] "RemoveContainer" containerID="9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e" Oct 07 13:05:35 crc kubenswrapper[4959]: E1007 13:05:35.259530 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e\": container with ID starting with 9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e not found: ID does not exist" containerID="9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.259564 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e"} err="failed to get container status \"9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e\": rpc error: code = NotFound desc = could not find container \"9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e\": container with ID starting with 9e9e041970f2a66b670eee0b985d4545dfbb938c6ac000a67c2897db425f197e not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.259584 4959 scope.go:117] "RemoveContainer" containerID="002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f" Oct 07 13:05:35 crc 
kubenswrapper[4959]: E1007 13:05:35.259911 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f\": container with ID starting with 002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f not found: ID does not exist" containerID="002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.259929 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f"} err="failed to get container status \"002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f\": rpc error: code = NotFound desc = could not find container \"002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f\": container with ID starting with 002d9d5259b17a0934caae8d0ff7c54367f7d6f98fba3da2d3d8b5926e8a712f not found: ID does not exist" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.900158 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf" event={"ID":"0548f538-781a-406b-8d2c-4449281cc77c","Type":"ContainerStarted","Data":"90c818963c49dd7a4dc55ff71dafa6a0eda07872096d4c95ebf366391a022e35"} Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.900204 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf" event={"ID":"0548f538-781a-406b-8d2c-4449281cc77c","Type":"ContainerStarted","Data":"d1d1454b0d9a307bc63b9674becf195484458cb4e4fea96c786e8d6c8f79d028"} Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.901363 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.904939 4959 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf" Oct 07 13:05:35 crc kubenswrapper[4959]: I1007 13:05:35.924967 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fbtxf" podStartSLOduration=1.9249532390000001 podStartE2EDuration="1.924953239s" podCreationTimestamp="2025-10-07 13:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:05:35.924806664 +0000 UTC m=+288.085529361" watchObservedRunningTime="2025-10-07 13:05:35.924953239 +0000 UTC m=+288.085675916" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.800596 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kk98t"] Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801466 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801484 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801497 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801505 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801518 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801526 4959 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801538 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85df292a-1000-48f0-be15-823ada38a57b" containerName="marketplace-operator" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801545 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="85df292a-1000-48f0-be15-823ada38a57b" containerName="marketplace-operator" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801558 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801566 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801577 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801583 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801597 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801604 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801612 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801621 4959 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801652 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801660 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="extract-utilities" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801670 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801677 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801687 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801693 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801703 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801711 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="extract-content" Oct 07 13:05:36 crc kubenswrapper[4959]: E1007 13:05:36.801719 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801726 4959 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801851 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801867 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801878 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="85df292a-1000-48f0-be15-823ada38a57b" containerName="marketplace-operator" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801891 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.801900 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" containerName="registry-server" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.803806 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.808528 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.822821 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b359b0-6002-4de2-912e-e5068f8fe8fe" path="/var/lib/kubelet/pods/04b359b0-6002-4de2-912e-e5068f8fe8fe/volumes" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.823825 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726f45c1-32a5-47f2-9540-d6ab00654be3" path="/var/lib/kubelet/pods/726f45c1-32a5-47f2-9540-d6ab00654be3/volumes" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.824710 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85df292a-1000-48f0-be15-823ada38a57b" path="/var/lib/kubelet/pods/85df292a-1000-48f0-be15-823ada38a57b/volumes" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.827789 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41be072-ce09-432f-8ea3-b1a0363a74df" path="/var/lib/kubelet/pods/c41be072-ce09-432f-8ea3-b1a0363a74df/volumes" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.828558 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5b4f5b-d699-457d-8363-def61c21613b" path="/var/lib/kubelet/pods/cd5b4f5b-d699-457d-8363-def61c21613b/volumes" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.830033 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kk98t"] Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.838589 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f310322-04af-455b-9cbc-d49bd49aec71-utilities\") pod \"redhat-marketplace-kk98t\" (UID: 
\"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.838677 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f310322-04af-455b-9cbc-d49bd49aec71-catalog-content\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.838722 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf94b\" (UniqueName: \"kubernetes.io/projected/6f310322-04af-455b-9cbc-d49bd49aec71-kube-api-access-jf94b\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.940555 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf94b\" (UniqueName: \"kubernetes.io/projected/6f310322-04af-455b-9cbc-d49bd49aec71-kube-api-access-jf94b\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.940866 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f310322-04af-455b-9cbc-d49bd49aec71-utilities\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.940928 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f310322-04af-455b-9cbc-d49bd49aec71-catalog-content\") pod 
\"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.941487 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f310322-04af-455b-9cbc-d49bd49aec71-utilities\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.941842 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f310322-04af-455b-9cbc-d49bd49aec71-catalog-content\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:36 crc kubenswrapper[4959]: I1007 13:05:36.967851 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf94b\" (UniqueName: \"kubernetes.io/projected/6f310322-04af-455b-9cbc-d49bd49aec71-kube-api-access-jf94b\") pod \"redhat-marketplace-kk98t\" (UID: \"6f310322-04af-455b-9cbc-d49bd49aec71\") " pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:36.998497 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pf2tx"] Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.002226 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.004948 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.011551 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pf2tx"] Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.044130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwv5\" (UniqueName: \"kubernetes.io/projected/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-kube-api-access-5wwv5\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.044242 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-utilities\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.044318 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-catalog-content\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.126159 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kk98t" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.146240 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-utilities\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.146364 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-catalog-content\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.146494 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwv5\" (UniqueName: \"kubernetes.io/projected/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-kube-api-access-5wwv5\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.148494 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-catalog-content\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.148669 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-utilities\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " 
pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.174432 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wwv5\" (UniqueName: \"kubernetes.io/projected/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-kube-api-access-5wwv5\") pod \"redhat-operators-pf2tx\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") " pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.328677 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pf2tx" Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.366749 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kk98t"] Oct 07 13:05:37 crc kubenswrapper[4959]: W1007 13:05:37.378210 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f310322_04af_455b_9cbc_d49bd49aec71.slice/crio-d7ac96e69cf40acc15d35a14f2f32dc1af930b55f54e1b05e83a2ccca22191e7 WatchSource:0}: Error finding container d7ac96e69cf40acc15d35a14f2f32dc1af930b55f54e1b05e83a2ccca22191e7: Status 404 returned error can't find the container with id d7ac96e69cf40acc15d35a14f2f32dc1af930b55f54e1b05e83a2ccca22191e7 Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.737745 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pf2tx"] Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.923323 4959 generic.go:334] "Generic (PLEG): container finished" podID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerID="a55b485333f098d979a21beaecc0a14acfcce37c94af187bdceae66d2335982d" exitCode=0 Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.923408 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf2tx" 
event={"ID":"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf","Type":"ContainerDied","Data":"a55b485333f098d979a21beaecc0a14acfcce37c94af187bdceae66d2335982d"} Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.923646 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf2tx" event={"ID":"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf","Type":"ContainerStarted","Data":"f9aec09891a36d308dc0775f04bc1dfa0e0afdf06998df3cd5b91c33115660c5"} Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.926928 4959 generic.go:334] "Generic (PLEG): container finished" podID="6f310322-04af-455b-9cbc-d49bd49aec71" containerID="55f527b967490168921901481cfe23de761a4c0a68f9462cf976d3b937b84048" exitCode=0 Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.927051 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kk98t" event={"ID":"6f310322-04af-455b-9cbc-d49bd49aec71","Type":"ContainerDied","Data":"55f527b967490168921901481cfe23de761a4c0a68f9462cf976d3b937b84048"} Oct 07 13:05:37 crc kubenswrapper[4959]: I1007 13:05:37.927112 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kk98t" event={"ID":"6f310322-04af-455b-9cbc-d49bd49aec71","Type":"ContainerStarted","Data":"d7ac96e69cf40acc15d35a14f2f32dc1af930b55f54e1b05e83a2ccca22191e7"} Oct 07 13:05:38 crc kubenswrapper[4959]: I1007 13:05:38.934914 4959 generic.go:334] "Generic (PLEG): container finished" podID="6f310322-04af-455b-9cbc-d49bd49aec71" containerID="c7ab467dc411c8d71605f91dd0c5edaeb7cf8f6ca51c934789b92302992315d4" exitCode=0 Oct 07 13:05:38 crc kubenswrapper[4959]: I1007 13:05:38.934974 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kk98t" event={"ID":"6f310322-04af-455b-9cbc-d49bd49aec71","Type":"ContainerDied","Data":"c7ab467dc411c8d71605f91dd0c5edaeb7cf8f6ca51c934789b92302992315d4"} Oct 07 13:05:38 crc kubenswrapper[4959]: I1007 
13:05:38.987427 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" podUID="14600350-80fe-4397-8fd8-02c6139cd9d6" containerName="registry" containerID="cri-o://e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69" gracePeriod=30 Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.200383 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2p6jh"] Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.201760 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.205710 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.221119 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2p6jh"] Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.284748 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0714844d-6d03-4a23-9611-e02495624e6d-utilities\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.284840 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0714844d-6d03-4a23-9611-e02495624e6d-catalog-content\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.284877 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zk5k\" (UniqueName: \"kubernetes.io/projected/0714844d-6d03-4a23-9611-e02495624e6d-kube-api-access-5zk5k\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.385842 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zk5k\" (UniqueName: \"kubernetes.io/projected/0714844d-6d03-4a23-9611-e02495624e6d-kube-api-access-5zk5k\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.386001 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0714844d-6d03-4a23-9611-e02495624e6d-utilities\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.386031 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0714844d-6d03-4a23-9611-e02495624e6d-catalog-content\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.387162 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0714844d-6d03-4a23-9611-e02495624e6d-catalog-content\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.387557 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0714844d-6d03-4a23-9611-e02495624e6d-utilities\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.415752 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ml74p"] Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.419474 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.419938 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zk5k\" (UniqueName: \"kubernetes.io/projected/0714844d-6d03-4a23-9611-e02495624e6d-kube-api-access-5zk5k\") pod \"community-operators-2p6jh\" (UID: \"0714844d-6d03-4a23-9611-e02495624e6d\") " pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.420062 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.422310 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.429215 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ml74p"] Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487195 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-certificates\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487265 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-tls\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487391 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-bound-sa-token\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487541 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487589 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sr5l7\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-kube-api-access-sr5l7\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487670 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14600350-80fe-4397-8fd8-02c6139cd9d6-installation-pull-secrets\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487708 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14600350-80fe-4397-8fd8-02c6139cd9d6-ca-trust-extracted\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.487772 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-trusted-ca\") pod \"14600350-80fe-4397-8fd8-02c6139cd9d6\" (UID: \"14600350-80fe-4397-8fd8-02c6139cd9d6\") " Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.488040 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-utilities\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.488109 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-catalog-content\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.488241 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvxg\" (UniqueName: \"kubernetes.io/projected/60d3850b-97a0-44a3-959a-7bcfe6524a49-kube-api-access-2cvxg\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.490580 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.490612 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.492600 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.492782 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-kube-api-access-sr5l7" (OuterVolumeSpecName: "kube-api-access-sr5l7") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "kube-api-access-sr5l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.493437 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.502852 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.503218 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14600350-80fe-4397-8fd8-02c6139cd9d6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.516376 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14600350-80fe-4397-8fd8-02c6139cd9d6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14600350-80fe-4397-8fd8-02c6139cd9d6" (UID: "14600350-80fe-4397-8fd8-02c6139cd9d6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.517619 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2p6jh" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.589865 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvxg\" (UniqueName: \"kubernetes.io/projected/60d3850b-97a0-44a3-959a-7bcfe6524a49-kube-api-access-2cvxg\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.590142 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-utilities\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.590201 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-catalog-content\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.590771 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-catalog-content\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.590888 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-utilities\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.590275 4959 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.593377 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr5l7\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-kube-api-access-sr5l7\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.593391 4959 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14600350-80fe-4397-8fd8-02c6139cd9d6-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.593401 4959 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14600350-80fe-4397-8fd8-02c6139cd9d6-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.593468 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.593480 4959 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.593516 4959 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14600350-80fe-4397-8fd8-02c6139cd9d6-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.618585 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvxg\" (UniqueName: \"kubernetes.io/projected/60d3850b-97a0-44a3-959a-7bcfe6524a49-kube-api-access-2cvxg\") pod \"certified-operators-ml74p\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") " pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.715744 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2p6jh"] Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.776333 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ml74p" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.943058 4959 generic.go:334] "Generic (PLEG): container finished" podID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerID="97d5d5aa99b2f363082c35d1c7fe7ee06d8eccd3497622f5bf838ea6f0e508f7" exitCode=0 Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.943141 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf2tx" event={"ID":"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf","Type":"ContainerDied","Data":"97d5d5aa99b2f363082c35d1c7fe7ee06d8eccd3497622f5bf838ea6f0e508f7"} Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.954219 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kk98t" event={"ID":"6f310322-04af-455b-9cbc-d49bd49aec71","Type":"ContainerStarted","Data":"dc2dac04fb0c754e9922cba22433271c6897d3b3c68524a4e5c2c35090e52115"} Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.957846 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" event={"ID":"14600350-80fe-4397-8fd8-02c6139cd9d6","Type":"ContainerDied","Data":"e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69"} Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.957933 4959 scope.go:117] "RemoveContainer" containerID="e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.959051 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.962694 4959 generic.go:334] "Generic (PLEG): container finished" podID="14600350-80fe-4397-8fd8-02c6139cd9d6" containerID="e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69" exitCode=0 Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.962871 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4wfvg" event={"ID":"14600350-80fe-4397-8fd8-02c6139cd9d6","Type":"ContainerDied","Data":"ebd71a1cb9657aaf72e2933e19e2b90677f907ce12e355dca412d4b73ef2a8e6"} Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.973058 4959 generic.go:334] "Generic (PLEG): container finished" podID="0714844d-6d03-4a23-9611-e02495624e6d" containerID="ae7e8262e629c2bcdfa4b53d3c6128b12fb36503626c9ea549d58d0133fb2f1c" exitCode=0 Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.973127 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p6jh" event={"ID":"0714844d-6d03-4a23-9611-e02495624e6d","Type":"ContainerDied","Data":"ae7e8262e629c2bcdfa4b53d3c6128b12fb36503626c9ea549d58d0133fb2f1c"} Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.973165 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p6jh" event={"ID":"0714844d-6d03-4a23-9611-e02495624e6d","Type":"ContainerStarted","Data":"f8c21c25f8ee5371869cd3173c6b0fb5d8c0eca86b11f7ba5fe469710a0488b5"} Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.988857 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kk98t" podStartSLOduration=2.531847945 podStartE2EDuration="3.988835881s" podCreationTimestamp="2025-10-07 13:05:36 +0000 UTC" firstStartedPulling="2025-10-07 13:05:37.933977997 +0000 UTC m=+290.094700674" lastFinishedPulling="2025-10-07 
13:05:39.390965933 +0000 UTC m=+291.551688610" observedRunningTime="2025-10-07 13:05:39.984106399 +0000 UTC m=+292.144829076" watchObservedRunningTime="2025-10-07 13:05:39.988835881 +0000 UTC m=+292.149558558" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.996735 4959 scope.go:117] "RemoveContainer" containerID="e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69" Oct 07 13:05:39 crc kubenswrapper[4959]: E1007 13:05:39.997553 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69\": container with ID starting with e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69 not found: ID does not exist" containerID="e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69" Oct 07 13:05:39 crc kubenswrapper[4959]: I1007 13:05:39.997587 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69"} err="failed to get container status \"e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69\": rpc error: code = NotFound desc = could not find container \"e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69\": container with ID starting with e2c9f41c0fc95710eab36dbd7cd9f5a7d60fff612e48ada6a42cb1b4b27a4a69 not found: ID does not exist" Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.034780 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4wfvg"] Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.037778 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4wfvg"] Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.172396 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ml74p"] Oct 07 13:05:40 
crc kubenswrapper[4959]: I1007 13:05:40.816702 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14600350-80fe-4397-8fd8-02c6139cd9d6" path="/var/lib/kubelet/pods/14600350-80fe-4397-8fd8-02c6139cd9d6/volumes" Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.979840 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf2tx" event={"ID":"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf","Type":"ContainerStarted","Data":"8602481403ace8c53d45347365090f3ca2ef82c8dea1d645828175c5eb3eebb7"} Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.982313 4959 generic.go:334] "Generic (PLEG): container finished" podID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerID="6946fc76285b7bf8c649a2d656f94b4c754f3059074420363f25545a8372cb7f" exitCode=0 Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.982443 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerDied","Data":"6946fc76285b7bf8c649a2d656f94b4c754f3059074420363f25545a8372cb7f"} Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.982490 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerStarted","Data":"43109ff57e0030e5059e71db4a5c392bdf2fd7b360b304086769be94e0f83472"} Oct 07 13:05:40 crc kubenswrapper[4959]: I1007 13:05:40.996608 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pf2tx" podStartSLOduration=2.450478072 podStartE2EDuration="4.996587529s" podCreationTimestamp="2025-10-07 13:05:36 +0000 UTC" firstStartedPulling="2025-10-07 13:05:37.926801125 +0000 UTC m=+290.087523802" lastFinishedPulling="2025-10-07 13:05:40.472910582 +0000 UTC m=+292.633633259" observedRunningTime="2025-10-07 13:05:40.996181246 +0000 UTC m=+293.156903943" 
watchObservedRunningTime="2025-10-07 13:05:40.996587529 +0000 UTC m=+293.157310206" Oct 07 13:05:42 crc kubenswrapper[4959]: I1007 13:05:42.014135 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerStarted","Data":"1798afb9f148632e03763b4dc16f47a4957be131ece10e18b6745665409d54df"} Oct 07 13:05:42 crc kubenswrapper[4959]: I1007 13:05:42.019815 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p6jh" event={"ID":"0714844d-6d03-4a23-9611-e02495624e6d","Type":"ContainerStarted","Data":"32ec4814157ec1956487945bcbc769b059ac115bd7f4a946ce5df3e96b796def"} Oct 07 13:05:43 crc kubenswrapper[4959]: I1007 13:05:43.027128 4959 generic.go:334] "Generic (PLEG): container finished" podID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerID="1798afb9f148632e03763b4dc16f47a4957be131ece10e18b6745665409d54df" exitCode=0 Oct 07 13:05:43 crc kubenswrapper[4959]: I1007 13:05:43.027199 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerDied","Data":"1798afb9f148632e03763b4dc16f47a4957be131ece10e18b6745665409d54df"} Oct 07 13:05:43 crc kubenswrapper[4959]: I1007 13:05:43.029546 4959 generic.go:334] "Generic (PLEG): container finished" podID="0714844d-6d03-4a23-9611-e02495624e6d" containerID="32ec4814157ec1956487945bcbc769b059ac115bd7f4a946ce5df3e96b796def" exitCode=0 Oct 07 13:05:43 crc kubenswrapper[4959]: I1007 13:05:43.029590 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p6jh" event={"ID":"0714844d-6d03-4a23-9611-e02495624e6d","Type":"ContainerDied","Data":"32ec4814157ec1956487945bcbc769b059ac115bd7f4a946ce5df3e96b796def"} Oct 07 13:05:44 crc kubenswrapper[4959]: I1007 13:05:44.037449 4959 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-2p6jh" event={"ID":"0714844d-6d03-4a23-9611-e02495624e6d","Type":"ContainerStarted","Data":"2299ddb1ecdb717da9faea6318844f0a18d77da21ec838eb24f7b484b0371832"}
Oct 07 13:05:44 crc kubenswrapper[4959]: I1007 13:05:44.039382 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerStarted","Data":"5921b06abe7189b11590b982572efc26959577349eda6853bc99a729737653cc"}
Oct 07 13:05:44 crc kubenswrapper[4959]: I1007 13:05:44.076077 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2p6jh" podStartSLOduration=1.5052815750000001 podStartE2EDuration="5.076059465s" podCreationTimestamp="2025-10-07 13:05:39 +0000 UTC" firstStartedPulling="2025-10-07 13:05:39.977601399 +0000 UTC m=+292.138324076" lastFinishedPulling="2025-10-07 13:05:43.548379289 +0000 UTC m=+295.709101966" observedRunningTime="2025-10-07 13:05:44.059405238 +0000 UTC m=+296.220127915" watchObservedRunningTime="2025-10-07 13:05:44.076059465 +0000 UTC m=+296.236782142"
Oct 07 13:05:44 crc kubenswrapper[4959]: I1007 13:05:44.076182 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ml74p" podStartSLOduration=2.573804072 podStartE2EDuration="5.076178889s" podCreationTimestamp="2025-10-07 13:05:39 +0000 UTC" firstStartedPulling="2025-10-07 13:05:40.98481778 +0000 UTC m=+293.145540457" lastFinishedPulling="2025-10-07 13:05:43.487192597 +0000 UTC m=+295.647915274" observedRunningTime="2025-10-07 13:05:44.073825953 +0000 UTC m=+296.234548650" watchObservedRunningTime="2025-10-07 13:05:44.076178889 +0000 UTC m=+296.236901566"
Oct 07 13:05:47 crc kubenswrapper[4959]: I1007 13:05:47.127226 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kk98t"
Oct 07 13:05:47 crc kubenswrapper[4959]: I1007 13:05:47.127471 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kk98t"
Oct 07 13:05:47 crc kubenswrapper[4959]: I1007 13:05:47.170830 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kk98t"
Oct 07 13:05:47 crc kubenswrapper[4959]: I1007 13:05:47.329805 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pf2tx"
Oct 07 13:05:47 crc kubenswrapper[4959]: I1007 13:05:47.329876 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pf2tx"
Oct 07 13:05:47 crc kubenswrapper[4959]: I1007 13:05:47.370085 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pf2tx"
Oct 07 13:05:48 crc kubenswrapper[4959]: I1007 13:05:48.095903 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pf2tx"
Oct 07 13:05:48 crc kubenswrapper[4959]: I1007 13:05:48.122216 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kk98t"
Oct 07 13:05:49 crc kubenswrapper[4959]: I1007 13:05:49.517820 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2p6jh"
Oct 07 13:05:49 crc kubenswrapper[4959]: I1007 13:05:49.518187 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2p6jh"
Oct 07 13:05:49 crc kubenswrapper[4959]: I1007 13:05:49.561573 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2p6jh"
Oct 07 13:05:49 crc kubenswrapper[4959]: I1007 13:05:49.776638 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ml74p"
Oct 07 13:05:49 crc kubenswrapper[4959]: I1007 13:05:49.777060 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ml74p"
Oct 07 13:05:49 crc kubenswrapper[4959]: I1007 13:05:49.812781 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ml74p"
Oct 07 13:05:50 crc kubenswrapper[4959]: I1007 13:05:50.109945 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2p6jh"
Oct 07 13:05:50 crc kubenswrapper[4959]: I1007 13:05:50.110888 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ml74p"
Oct 07 13:07:07 crc kubenswrapper[4959]: I1007 13:07:07.695367 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:07:07 crc kubenswrapper[4959]: I1007 13:07:07.696093 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:07:37 crc kubenswrapper[4959]: I1007 13:07:37.695707 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:07:37 crc kubenswrapper[4959]: I1007 13:07:37.696267 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.695991 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.696661 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.696725 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.697484 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbc8950264097c3d06b2e882262edee6c191ec573bd905f62e8addd40b664809"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.697570 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://cbc8950264097c3d06b2e882262edee6c191ec573bd905f62e8addd40b664809" gracePeriod=600
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.905540 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="cbc8950264097c3d06b2e882262edee6c191ec573bd905f62e8addd40b664809" exitCode=0
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.905598 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"cbc8950264097c3d06b2e882262edee6c191ec573bd905f62e8addd40b664809"}
Oct 07 13:08:07 crc kubenswrapper[4959]: I1007 13:08:07.905659 4959 scope.go:117] "RemoveContainer" containerID="c58024fd915f59dd9e888ccdd40407fda126b287a079de29ffcf17740d29125f"
Oct 07 13:08:08 crc kubenswrapper[4959]: I1007 13:08:08.912012 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"083fdb2a36bd004d963c9bad52ff246ecb91d3e06944051b30461a673d36f5e0"}
Oct 07 13:10:07 crc kubenswrapper[4959]: I1007 13:10:07.695682 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:10:07 crc kubenswrapper[4959]: I1007 13:10:07.696767 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:10:37 crc kubenswrapper[4959]: I1007 13:10:37.695305 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:10:37 crc kubenswrapper[4959]: I1007 13:10:37.695852 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.411660 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-585vd"]
Oct 07 13:10:49 crc kubenswrapper[4959]: E1007 13:10:49.412565 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14600350-80fe-4397-8fd8-02c6139cd9d6" containerName="registry"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.412577 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="14600350-80fe-4397-8fd8-02c6139cd9d6" containerName="registry"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.412699 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="14600350-80fe-4397-8fd8-02c6139cd9d6" containerName="registry"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.413061 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.413921 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-d6468"]
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.414653 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-d6468"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.415673 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.415818 4959 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-n4qz7"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.415915 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.416102 4959 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7t688"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.416956 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-585vd"]
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.423888 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-d6468"]
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.447916 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"]
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.448601 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.450377 4959 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qdxvd"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.451324 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"]
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.508694 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x958\" (UniqueName: \"kubernetes.io/projected/897ad114-2a60-468e-8c81-2367ded7fe7b-kube-api-access-4x958\") pod \"cert-manager-cainjector-7f985d654d-585vd\" (UID: \"897ad114-2a60-468e-8c81-2367ded7fe7b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.508733 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ddt6\" (UniqueName: \"kubernetes.io/projected/617c6991-922b-4bd1-b578-2327061ba973-kube-api-access-5ddt6\") pod \"cert-manager-webhook-5655c58dd6-hp7q8\" (UID: \"617c6991-922b-4bd1-b578-2327061ba973\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.508764 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf2l6\" (UniqueName: \"kubernetes.io/projected/fe93de9f-3c30-4373-bc80-912dd219d1f9-kube-api-access-mf2l6\") pod \"cert-manager-5b446d88c5-d6468\" (UID: \"fe93de9f-3c30-4373-bc80-912dd219d1f9\") " pod="cert-manager/cert-manager-5b446d88c5-d6468"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.609662 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x958\" (UniqueName: \"kubernetes.io/projected/897ad114-2a60-468e-8c81-2367ded7fe7b-kube-api-access-4x958\") pod \"cert-manager-cainjector-7f985d654d-585vd\" (UID: \"897ad114-2a60-468e-8c81-2367ded7fe7b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.609702 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ddt6\" (UniqueName: \"kubernetes.io/projected/617c6991-922b-4bd1-b578-2327061ba973-kube-api-access-5ddt6\") pod \"cert-manager-webhook-5655c58dd6-hp7q8\" (UID: \"617c6991-922b-4bd1-b578-2327061ba973\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.609731 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf2l6\" (UniqueName: \"kubernetes.io/projected/fe93de9f-3c30-4373-bc80-912dd219d1f9-kube-api-access-mf2l6\") pod \"cert-manager-5b446d88c5-d6468\" (UID: \"fe93de9f-3c30-4373-bc80-912dd219d1f9\") " pod="cert-manager/cert-manager-5b446d88c5-d6468"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.627018 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf2l6\" (UniqueName: \"kubernetes.io/projected/fe93de9f-3c30-4373-bc80-912dd219d1f9-kube-api-access-mf2l6\") pod \"cert-manager-5b446d88c5-d6468\" (UID: \"fe93de9f-3c30-4373-bc80-912dd219d1f9\") " pod="cert-manager/cert-manager-5b446d88c5-d6468"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.630552 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ddt6\" (UniqueName: \"kubernetes.io/projected/617c6991-922b-4bd1-b578-2327061ba973-kube-api-access-5ddt6\") pod \"cert-manager-webhook-5655c58dd6-hp7q8\" (UID: \"617c6991-922b-4bd1-b578-2327061ba973\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.635334 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x958\" (UniqueName: \"kubernetes.io/projected/897ad114-2a60-468e-8c81-2367ded7fe7b-kube-api-access-4x958\") pod \"cert-manager-cainjector-7f985d654d-585vd\" (UID: \"897ad114-2a60-468e-8c81-2367ded7fe7b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.730078 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.739379 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-d6468"
Oct 07 13:10:49 crc kubenswrapper[4959]: I1007 13:10:49.764256 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.006609 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-585vd"]
Oct 07 13:10:50 crc kubenswrapper[4959]: W1007 13:10:50.017344 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod897ad114_2a60_468e_8c81_2367ded7fe7b.slice/crio-c2410da5d00104a86257f3544e93ba91648ea7af652ef9f3fd261e7a191a237c WatchSource:0}: Error finding container c2410da5d00104a86257f3544e93ba91648ea7af652ef9f3fd261e7a191a237c: Status 404 returned error can't find the container with id c2410da5d00104a86257f3544e93ba91648ea7af652ef9f3fd261e7a191a237c
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.023430 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.051089 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"]
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.198492 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-d6468"]
Oct 07 13:10:50 crc kubenswrapper[4959]: W1007 13:10:50.205426 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe93de9f_3c30_4373_bc80_912dd219d1f9.slice/crio-b248db69b0bd1dcd401945a9e31e3b31354e322aae441db7d286e0a00ea1982e WatchSource:0}: Error finding container b248db69b0bd1dcd401945a9e31e3b31354e322aae441db7d286e0a00ea1982e: Status 404 returned error can't find the container with id b248db69b0bd1dcd401945a9e31e3b31354e322aae441db7d286e0a00ea1982e
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.780386 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd" event={"ID":"897ad114-2a60-468e-8c81-2367ded7fe7b","Type":"ContainerStarted","Data":"c2410da5d00104a86257f3544e93ba91648ea7af652ef9f3fd261e7a191a237c"}
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.781162 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-d6468" event={"ID":"fe93de9f-3c30-4373-bc80-912dd219d1f9","Type":"ContainerStarted","Data":"b248db69b0bd1dcd401945a9e31e3b31354e322aae441db7d286e0a00ea1982e"}
Oct 07 13:10:50 crc kubenswrapper[4959]: I1007 13:10:50.781906 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8" event={"ID":"617c6991-922b-4bd1-b578-2327061ba973","Type":"ContainerStarted","Data":"12aa4a38548875bbecda2bbe95816b064b23d72deaffdf0322b5dbe230c177df"}
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.796663 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8" event={"ID":"617c6991-922b-4bd1-b578-2327061ba973","Type":"ContainerStarted","Data":"6e187e8f62d9ca8243a3cd5f68195615ac8f9d302c9b759b46768f51ed88ee08"}
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.798529 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.800477 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd" event={"ID":"897ad114-2a60-468e-8c81-2367ded7fe7b","Type":"ContainerStarted","Data":"7bfd8bdc8cb8b85d4be865f21cbc7c9901cec8e79b252031f9400ace73c7452f"}
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.802013 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-d6468" event={"ID":"fe93de9f-3c30-4373-bc80-912dd219d1f9","Type":"ContainerStarted","Data":"5c8af0d0cd5bde3f379637edfb6e314efb309f2e181ea047314875309d0908cd"}
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.812693 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8" podStartSLOduration=1.5649243099999999 podStartE2EDuration="4.81267258s" podCreationTimestamp="2025-10-07 13:10:49 +0000 UTC" firstStartedPulling="2025-10-07 13:10:50.056470231 +0000 UTC m=+602.217192908" lastFinishedPulling="2025-10-07 13:10:53.304218501 +0000 UTC m=+605.464941178" observedRunningTime="2025-10-07 13:10:53.811061073 +0000 UTC m=+605.971783780" watchObservedRunningTime="2025-10-07 13:10:53.81267258 +0000 UTC m=+605.973395257"
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.829301 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-d6468" podStartSLOduration=1.675736074 podStartE2EDuration="4.82928428s" podCreationTimestamp="2025-10-07 13:10:49 +0000 UTC" firstStartedPulling="2025-10-07 13:10:50.207658832 +0000 UTC m=+602.368381509" lastFinishedPulling="2025-10-07 13:10:53.361207038 +0000 UTC m=+605.521929715" observedRunningTime="2025-10-07 13:10:53.828838597 +0000 UTC m=+605.989561334" watchObservedRunningTime="2025-10-07 13:10:53.82928428 +0000 UTC m=+605.990006957"
Oct 07 13:10:53 crc kubenswrapper[4959]: I1007 13:10:53.852795 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-585vd" podStartSLOduration=1.572059956 podStartE2EDuration="4.852773299s" podCreationTimestamp="2025-10-07 13:10:49 +0000 UTC" firstStartedPulling="2025-10-07 13:10:50.023165358 +0000 UTC m=+602.183888035" lastFinishedPulling="2025-10-07 13:10:53.303878681 +0000 UTC m=+605.464601378" observedRunningTime="2025-10-07 13:10:53.852178182 +0000 UTC m=+606.012900859" watchObservedRunningTime="2025-10-07 13:10:53.852773299 +0000 UTC m=+606.013495976"
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.768179 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-hp7q8"
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948199 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jfm8k"]
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948528 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-controller" containerID="cri-o://ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948589 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="nbdb" containerID="cri-o://c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948685 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-acl-logging" containerID="cri-o://b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948666 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948674 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-node" containerID="cri-o://3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.948655 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="northd" containerID="cri-o://74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.949009 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="sbdb" containerID="cri-o://7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" gracePeriod=30
Oct 07 13:10:59 crc kubenswrapper[4959]: I1007 13:10:59.980871 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller" containerID="cri-o://ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" gracePeriod=30
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.224838 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/3.log"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.226432 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovn-acl-logging/0.log"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.226942 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovn-controller/0.log"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.227300 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.271940 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wvrqm"]
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272129 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272142 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272150 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272156 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272163 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="northd"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272170 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="northd"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272183 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-acl-logging"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272189 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-acl-logging"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272197 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272202 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272210 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-node"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272217 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-node"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272264 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="sbdb"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272271 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="sbdb"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272281 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272287 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272297 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272303 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272311 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kubecfg-setup"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272317 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kubecfg-setup"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272325 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-ovn-metrics"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272331 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-ovn-metrics"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272339 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="nbdb"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272344 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="nbdb"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272431 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272439 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272448 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272455 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272462 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovn-acl-logging"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272469 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-node"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272478 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="nbdb"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272486 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="sbdb"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272492 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="kube-rbac-proxy-ovn-metrics"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272499 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="northd"
Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.272582 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272589 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272687 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.272697 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerName="ovnkube-controller"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.274181 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm"
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344747 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") "
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344805 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-bin\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") "
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344831 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-kubelet\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") "
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344880 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-ovn-kubernetes\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") "
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344918 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344944 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344957 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344918 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.344958 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-netd\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") "
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345018 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-cni-netd".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345266 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-etc-openvswitch\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345292 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-netns\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345318 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-env-overrides\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345342 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-config\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345352 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345374 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-ovn\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345392 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-systemd\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345425 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77tdh\" (UniqueName: \"kubernetes.io/projected/b26fd9a1-4343-4f1c-bef3-764d3c74724a-kube-api-access-77tdh\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345442 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-var-lib-openvswitch\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345458 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-script-lib\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345481 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-log-socket\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345502 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-node-log\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345518 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-systemd-units\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345546 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovn-node-metrics-cert\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345560 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-slash\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345585 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-openvswitch\") pod \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\" (UID: \"b26fd9a1-4343-4f1c-bef3-764d3c74724a\") " Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 
13:11:00.345820 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-var-lib-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345841 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-slash\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345859 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-ovnkube-config\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345873 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/555fd0fb-24f2-416b-8562-b400d07b5335-ovn-node-metrics-cert\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345891 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-ovnkube-script-lib\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 
13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345897 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345910 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-systemd\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345938 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345923 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345951 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345966 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346032 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346047 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346058 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-slash" (OuterVolumeSpecName: "host-slash") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346088 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-log-socket" (OuterVolumeSpecName: "log-socket") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346149 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-kubelet\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346207 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346236 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-script-lib" (OuterVolumeSpecName: 
"ovnkube-script-lib") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.345900 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-node-log" (OuterVolumeSpecName: "node-log") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346310 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-log-socket\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346338 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkt55\" (UniqueName: \"kubernetes.io/projected/555fd0fb-24f2-416b-8562-b400d07b5335-kube-api-access-pkt55\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346355 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-systemd-units\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346375 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-node-log\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346393 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346410 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-run-netns\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346429 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-ovn\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346448 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-cni-bin\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346592 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346665 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-cni-netd\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346702 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-env-overrides\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346792 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-etc-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346870 4959 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346886 4959 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346898 4959 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346909 4959 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346923 4959 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.346972 4959 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347002 4959 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347021 4959 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347038 4959 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc 
kubenswrapper[4959]: I1007 13:11:00.347060 4959 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347086 4959 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347108 4959 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347135 4959 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347154 4959 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347171 4959 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347188 4959 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.347205 4959 
reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.351529 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26fd9a1-4343-4f1c-bef3-764d3c74724a-kube-api-access-77tdh" (OuterVolumeSpecName: "kube-api-access-77tdh") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "kube-api-access-77tdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.351564 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.359137 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b26fd9a1-4343-4f1c-bef3-764d3c74724a" (UID: "b26fd9a1-4343-4f1c-bef3-764d3c74724a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448486 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-node-log\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448537 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-run-netns\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448558 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448579 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-ovn\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448597 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-cni-bin\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" 
Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448616 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448652 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-cni-netd\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448672 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-env-overrides\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448692 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-etc-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448774 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-ovn\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448818 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-etc-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448808 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448851 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-node-log\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448718 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-var-lib-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448852 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-cni-netd\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448878 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-run-netns\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448915 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-slash\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448893 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-var-lib-openvswitch\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448809 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.448985 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-ovnkube-config\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449006 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-slash\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449041 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/555fd0fb-24f2-416b-8562-b400d07b5335-ovn-node-metrics-cert\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449060 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-ovnkube-script-lib\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449078 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-systemd\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449107 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-kubelet\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449124 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449147 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-log-socket\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449168 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkt55\" (UniqueName: \"kubernetes.io/projected/555fd0fb-24f2-416b-8562-b400d07b5335-kube-api-access-pkt55\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449202 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-systemd-units\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449243 4959 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b26fd9a1-4343-4f1c-bef3-764d3c74724a-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449257 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77tdh\" (UniqueName: \"kubernetes.io/projected/b26fd9a1-4343-4f1c-bef3-764d3c74724a-kube-api-access-77tdh\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449269 4959 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b26fd9a1-4343-4f1c-bef3-764d3c74724a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449257 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-run-systemd\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449302 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-kubelet\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449323 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449344 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-log-socket\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449485 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-env-overrides\") pod \"ovnkube-node-wvrqm\" (UID: 
\"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449557 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-systemd-units\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.449608 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/555fd0fb-24f2-416b-8562-b400d07b5335-host-cni-bin\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.450234 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-ovnkube-script-lib\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.450398 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/555fd0fb-24f2-416b-8562-b400d07b5335-ovnkube-config\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.453715 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/555fd0fb-24f2-416b-8562-b400d07b5335-ovn-node-metrics-cert\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 
13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.464656 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkt55\" (UniqueName: \"kubernetes.io/projected/555fd0fb-24f2-416b-8562-b400d07b5335-kube-api-access-pkt55\") pod \"ovnkube-node-wvrqm\" (UID: \"555fd0fb-24f2-416b-8562-b400d07b5335\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.589927 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:00 crc kubenswrapper[4959]: W1007 13:11:00.614672 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod555fd0fb_24f2_416b_8562_b400d07b5335.slice/crio-839b30217d9b197242b10e3ecfc4fb62921e4a3ce309d5f57aa7a302a764d448 WatchSource:0}: Error finding container 839b30217d9b197242b10e3ecfc4fb62921e4a3ce309d5f57aa7a302a764d448: Status 404 returned error can't find the container with id 839b30217d9b197242b10e3ecfc4fb62921e4a3ce309d5f57aa7a302a764d448 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.838801 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/2.log" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.839470 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/1.log" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.839526 4959 generic.go:334] "Generic (PLEG): container finished" podID="07e132b2-5c1c-488e-abf4-bdaf3fcf4f93" containerID="be6fa8893a9af981bea7715b2b6e5dc55dd168d348c042414a85e37d321aecc4" exitCode=2 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.839675 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" 
event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerDied","Data":"be6fa8893a9af981bea7715b2b6e5dc55dd168d348c042414a85e37d321aecc4"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.839799 4959 scope.go:117] "RemoveContainer" containerID="6ebbd87d361ada172f37589560b8bbad1527d29ca09a5dccf608e7e5e603fbae" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.840593 4959 scope.go:117] "RemoveContainer" containerID="be6fa8893a9af981bea7715b2b6e5dc55dd168d348c042414a85e37d321aecc4" Oct 07 13:11:00 crc kubenswrapper[4959]: E1007 13:11:00.841210 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-b2pc7_openshift-multus(07e132b2-5c1c-488e-abf4-bdaf3fcf4f93)\"" pod="openshift-multus/multus-b2pc7" podUID="07e132b2-5c1c-488e-abf4-bdaf3fcf4f93" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.848045 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovnkube-controller/3.log" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.855326 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovn-acl-logging/0.log" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856130 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jfm8k_b26fd9a1-4343-4f1c-bef3-764d3c74724a/ovn-controller/0.log" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856645 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856677 4959 generic.go:334] "Generic (PLEG): container finished" 
podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856689 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856703 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856716 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856727 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856737 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" exitCode=143 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856748 4959 generic.go:334] "Generic (PLEG): container finished" podID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" exitCode=143 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856893 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} Oct 07 
13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856947 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856970 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856991 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857012 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857033 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857054 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857071 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857082 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857094 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857104 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857154 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857166 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857179 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857190 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857200 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857216 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857234 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857246 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857257 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857267 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857279 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857289 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857300 4959 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857311 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857321 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857332 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857348 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857364 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857376 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857387 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} Oct 07 
13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857397 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857409 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857419 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857429 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857444 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857454 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857467 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857481 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" 
event={"ID":"b26fd9a1-4343-4f1c-bef3-764d3c74724a","Type":"ContainerDied","Data":"bbd780aded1d877ecf4e1cd985d32ca3597d0c60c31c75f13ee41e23a00d0112"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857497 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857510 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857521 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857531 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857542 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857553 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857563 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857573 4959 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857583 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.857593 4959 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.856908 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfm8k" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.867274 4959 generic.go:334] "Generic (PLEG): container finished" podID="555fd0fb-24f2-416b-8562-b400d07b5335" containerID="f6bb1aac971cbfeafbf099077f90919da3bd99158c29f80daa20f84c71511cb0" exitCode=0 Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.867406 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerDied","Data":"f6bb1aac971cbfeafbf099077f90919da3bd99158c29f80daa20f84c71511cb0"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.867493 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"839b30217d9b197242b10e3ecfc4fb62921e4a3ce309d5f57aa7a302a764d448"} Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.896481 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jfm8k"] Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.896948 4959 scope.go:117] "RemoveContainer" 
containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.901689 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jfm8k"] Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.930982 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.963735 4959 scope.go:117] "RemoveContainer" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.978043 4959 scope.go:117] "RemoveContainer" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" Oct 07 13:11:00 crc kubenswrapper[4959]: I1007 13:11:00.999726 4959 scope.go:117] "RemoveContainer" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.016774 4959 scope.go:117] "RemoveContainer" containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.034803 4959 scope.go:117] "RemoveContainer" containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.054979 4959 scope.go:117] "RemoveContainer" containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.089572 4959 scope.go:117] "RemoveContainer" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.107728 4959 scope.go:117] "RemoveContainer" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.128920 4959 scope.go:117] "RemoveContainer" 
containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.129462 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": container with ID starting with ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185 not found: ID does not exist" containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.129501 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} err="failed to get container status \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": rpc error: code = NotFound desc = could not find container \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": container with ID starting with ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.129533 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.130147 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": container with ID starting with 27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20 not found: ID does not exist" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.130185 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} err="failed to get container status \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": rpc error: code = NotFound desc = could not find container \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": container with ID starting with 27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.130216 4959 scope.go:117] "RemoveContainer" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.130571 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": container with ID starting with 7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972 not found: ID does not exist" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.130591 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} err="failed to get container status \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": rpc error: code = NotFound desc = could not find container \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": container with ID starting with 7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.130604 4959 scope.go:117] "RemoveContainer" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.130906 4959 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": container with ID starting with c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9 not found: ID does not exist" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.130944 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} err="failed to get container status \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": rpc error: code = NotFound desc = could not find container \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": container with ID starting with c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.130974 4959 scope.go:117] "RemoveContainer" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.131340 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": container with ID starting with 74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1 not found: ID does not exist" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.131364 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} err="failed to get container status \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": rpc error: code = NotFound desc = could not find container 
\"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": container with ID starting with 74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.131380 4959 scope.go:117] "RemoveContainer" containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.131635 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": container with ID starting with cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895 not found: ID does not exist" containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.131662 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} err="failed to get container status \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": rpc error: code = NotFound desc = could not find container \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": container with ID starting with cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.131680 4959 scope.go:117] "RemoveContainer" containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.131918 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": container with ID starting with 3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356 not found: ID does not exist" 
containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.131943 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} err="failed to get container status \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": rpc error: code = NotFound desc = could not find container \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": container with ID starting with 3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.131956 4959 scope.go:117] "RemoveContainer" containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.132229 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": container with ID starting with b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2 not found: ID does not exist" containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.132248 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} err="failed to get container status \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": rpc error: code = NotFound desc = could not find container \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": container with ID starting with b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.132259 4959 scope.go:117] 
"RemoveContainer" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.132468 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": container with ID starting with ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83 not found: ID does not exist" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.132491 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} err="failed to get container status \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": rpc error: code = NotFound desc = could not find container \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": container with ID starting with ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.132505 4959 scope.go:117] "RemoveContainer" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" Oct 07 13:11:01 crc kubenswrapper[4959]: E1007 13:11:01.132744 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": container with ID starting with c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008 not found: ID does not exist" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.132764 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} err="failed to get container status \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": rpc error: code = NotFound desc = could not find container \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": container with ID starting with c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.132778 4959 scope.go:117] "RemoveContainer" containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133106 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} err="failed to get container status \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": rpc error: code = NotFound desc = could not find container \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": container with ID starting with ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133128 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133387 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} err="failed to get container status \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": rpc error: code = NotFound desc = could not find container \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": container with ID starting with 27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20 not found: ID does not 
exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133429 4959 scope.go:117] "RemoveContainer" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133717 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} err="failed to get container status \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": rpc error: code = NotFound desc = could not find container \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": container with ID starting with 7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133737 4959 scope.go:117] "RemoveContainer" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133963 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} err="failed to get container status \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": rpc error: code = NotFound desc = could not find container \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": container with ID starting with c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.133992 4959 scope.go:117] "RemoveContainer" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134212 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} err="failed to get container status 
\"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": rpc error: code = NotFound desc = could not find container \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": container with ID starting with 74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134231 4959 scope.go:117] "RemoveContainer" containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134444 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} err="failed to get container status \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": rpc error: code = NotFound desc = could not find container \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": container with ID starting with cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134464 4959 scope.go:117] "RemoveContainer" containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134715 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} err="failed to get container status \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": rpc error: code = NotFound desc = could not find container \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": container with ID starting with 3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134733 4959 scope.go:117] "RemoveContainer" 
containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134948 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} err="failed to get container status \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": rpc error: code = NotFound desc = could not find container \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": container with ID starting with b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.134970 4959 scope.go:117] "RemoveContainer" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.135192 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} err="failed to get container status \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": rpc error: code = NotFound desc = could not find container \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": container with ID starting with ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.135211 4959 scope.go:117] "RemoveContainer" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.135456 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} err="failed to get container status \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": rpc error: code = NotFound desc = could 
not find container \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": container with ID starting with c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.135480 4959 scope.go:117] "RemoveContainer" containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.135740 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} err="failed to get container status \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": rpc error: code = NotFound desc = could not find container \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": container with ID starting with ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.135769 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136050 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} err="failed to get container status \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": rpc error: code = NotFound desc = could not find container \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": container with ID starting with 27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136076 4959 scope.go:117] "RemoveContainer" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 
13:11:01.136353 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} err="failed to get container status \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": rpc error: code = NotFound desc = could not find container \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": container with ID starting with 7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136375 4959 scope.go:117] "RemoveContainer" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136641 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} err="failed to get container status \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": rpc error: code = NotFound desc = could not find container \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": container with ID starting with c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136671 4959 scope.go:117] "RemoveContainer" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136931 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} err="failed to get container status \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": rpc error: code = NotFound desc = could not find container \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": container with ID starting with 
74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.136954 4959 scope.go:117] "RemoveContainer" containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.137205 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} err="failed to get container status \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": rpc error: code = NotFound desc = could not find container \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": container with ID starting with cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.137228 4959 scope.go:117] "RemoveContainer" containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.137481 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} err="failed to get container status \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": rpc error: code = NotFound desc = could not find container \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": container with ID starting with 3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.137503 4959 scope.go:117] "RemoveContainer" containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.137756 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} err="failed to get container status \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": rpc error: code = NotFound desc = could not find container \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": container with ID starting with b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.137778 4959 scope.go:117] "RemoveContainer" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138047 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} err="failed to get container status \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": rpc error: code = NotFound desc = could not find container \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": container with ID starting with ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138066 4959 scope.go:117] "RemoveContainer" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138279 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} err="failed to get container status \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": rpc error: code = NotFound desc = could not find container \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": container with ID starting with c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008 not found: ID does not 
exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138304 4959 scope.go:117] "RemoveContainer" containerID="ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138506 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185"} err="failed to get container status \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": rpc error: code = NotFound desc = could not find container \"ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185\": container with ID starting with ae16227a85db63c0eca3a0a69680d2011c1fa6406e6ef5aa48198a549eb68185 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138526 4959 scope.go:117] "RemoveContainer" containerID="27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138726 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20"} err="failed to get container status \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": rpc error: code = NotFound desc = could not find container \"27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20\": container with ID starting with 27cbb61841c8e77812e9c510a46c962f3fc6d6e228ccd59a001ef12d66724e20 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138746 4959 scope.go:117] "RemoveContainer" containerID="7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138926 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972"} err="failed to get container status 
\"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": rpc error: code = NotFound desc = could not find container \"7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972\": container with ID starting with 7aa85ad80baebad63a309efdb27236b5d9369202c8f764a8fc99eca88fba6972 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.138954 4959 scope.go:117] "RemoveContainer" containerID="c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139155 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9"} err="failed to get container status \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": rpc error: code = NotFound desc = could not find container \"c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9\": container with ID starting with c792e5ff8e58d9a78c4e9db9beac0f696dcfa4d88b24494a1188da891ea75be9 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139177 4959 scope.go:117] "RemoveContainer" containerID="74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139350 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1"} err="failed to get container status \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": rpc error: code = NotFound desc = could not find container \"74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1\": container with ID starting with 74ee8c4bc5a966a1c009bd4eb82a19bf34f2177de14b3becaa889e31d71ce4b1 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139368 4959 scope.go:117] "RemoveContainer" 
containerID="cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139553 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895"} err="failed to get container status \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": rpc error: code = NotFound desc = could not find container \"cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895\": container with ID starting with cf9e77c84670f8be0eb87a308bcaf1a2ed5f9e33e14317b7d035979ea51cf895 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139575 4959 scope.go:117] "RemoveContainer" containerID="3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139768 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356"} err="failed to get container status \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": rpc error: code = NotFound desc = could not find container \"3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356\": container with ID starting with 3c9094eced43b70b6be60dc444eb895528f304f8c6f49797b83bee13aa43f356 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139793 4959 scope.go:117] "RemoveContainer" containerID="b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.139988 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2"} err="failed to get container status \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": rpc error: code = NotFound desc = could 
not find container \"b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2\": container with ID starting with b993addabcbd9e729e6a23a997aa53927ebac60244a5bd94a797765d63b664e2 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.140006 4959 scope.go:117] "RemoveContainer" containerID="ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.140192 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83"} err="failed to get container status \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": rpc error: code = NotFound desc = could not find container \"ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83\": container with ID starting with ef7c57b79b0bf35c859066f368d056b6453c9a364499edb3ceaae9dd25ee6a83 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.140211 4959 scope.go:117] "RemoveContainer" containerID="c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.140395 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008"} err="failed to get container status \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": rpc error: code = NotFound desc = could not find container \"c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008\": container with ID starting with c4c47636502824bce8df8def75bc76cd71386168791bc66019fea419b803d008 not found: ID does not exist" Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.873699 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/2.log" Oct 07 13:11:01 crc 
kubenswrapper[4959]: I1007 13:11:01.878714 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"73c7b1ae1af52e2666eda450e5e088a62d660651cf92b75d0d734e00bb5cfc4f"} Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.878753 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"32600c9f45559ca8aedba26bdc47194b28a11a567bc889df039e7ab4096dae98"} Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.878764 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"67bde2871858b2cc7e4ab96b3d26a9868d38e54554c70436e19057e3fb76805e"} Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.878772 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"1772ea10f3123a51972c055955879d4659bc89c89b8fd85de7810304bae6cf4c"} Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.878781 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"26e6faef15a07aa333e756ef5cf4fd40246f9c5ebba475cebf6c6c20c250969e"} Oct 07 13:11:01 crc kubenswrapper[4959]: I1007 13:11:01.878797 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"0667970fb835095220e209c9a16ea70de2de976f86e5c84a5c2cb6af2a89fe37"} Oct 07 13:11:02 crc kubenswrapper[4959]: I1007 13:11:02.826391 4959 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b26fd9a1-4343-4f1c-bef3-764d3c74724a" path="/var/lib/kubelet/pods/b26fd9a1-4343-4f1c-bef3-764d3c74724a/volumes" Oct 07 13:11:03 crc kubenswrapper[4959]: I1007 13:11:03.894937 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"e571c3779baf5f7532310c37cdd0f3133a482d80f42956f81ecfa4db37e8e0c9"} Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.912423 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" event={"ID":"555fd0fb-24f2-416b-8562-b400d07b5335","Type":"ContainerStarted","Data":"d8d0dda5e877fe93d6c8b9810de5f0be53640660336017de940b6d3cefb7afb3"} Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.912996 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.913014 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.913025 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.937924 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.938216 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:06 crc kubenswrapper[4959]: I1007 13:11:06.955748 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" podStartSLOduration=6.955726535 podStartE2EDuration="6.955726535s" 
podCreationTimestamp="2025-10-07 13:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:11:06.95036011 +0000 UTC m=+619.111082797" watchObservedRunningTime="2025-10-07 13:11:06.955726535 +0000 UTC m=+619.116449212" Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.696118 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.696437 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.696484 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.697024 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"083fdb2a36bd004d963c9bad52ff246ecb91d3e06944051b30461a673d36f5e0"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.697091 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" 
containerID="cri-o://083fdb2a36bd004d963c9bad52ff246ecb91d3e06944051b30461a673d36f5e0" gracePeriod=600 Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.919709 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="083fdb2a36bd004d963c9bad52ff246ecb91d3e06944051b30461a673d36f5e0" exitCode=0 Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.919756 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"083fdb2a36bd004d963c9bad52ff246ecb91d3e06944051b30461a673d36f5e0"} Oct 07 13:11:07 crc kubenswrapper[4959]: I1007 13:11:07.919852 4959 scope.go:117] "RemoveContainer" containerID="cbc8950264097c3d06b2e882262edee6c191ec573bd905f62e8addd40b664809" Oct 07 13:11:08 crc kubenswrapper[4959]: I1007 13:11:08.930941 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"ef53d2923cca70810fb795e5b43b9166268df0aa973b6ab2fa0bc61b8de8c8ee"} Oct 07 13:11:12 crc kubenswrapper[4959]: I1007 13:11:12.808880 4959 scope.go:117] "RemoveContainer" containerID="be6fa8893a9af981bea7715b2b6e5dc55dd168d348c042414a85e37d321aecc4" Oct 07 13:11:12 crc kubenswrapper[4959]: E1007 13:11:12.809799 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-b2pc7_openshift-multus(07e132b2-5c1c-488e-abf4-bdaf3fcf4f93)\"" pod="openshift-multus/multus-b2pc7" podUID="07e132b2-5c1c-488e-abf4-bdaf3fcf4f93" Oct 07 13:11:24 crc kubenswrapper[4959]: I1007 13:11:24.808508 4959 scope.go:117] "RemoveContainer" containerID="be6fa8893a9af981bea7715b2b6e5dc55dd168d348c042414a85e37d321aecc4" Oct 07 13:11:25 crc 
kubenswrapper[4959]: I1007 13:11:25.024670 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2pc7_07e132b2-5c1c-488e-abf4-bdaf3fcf4f93/kube-multus/2.log" Oct 07 13:11:25 crc kubenswrapper[4959]: I1007 13:11:25.024725 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2pc7" event={"ID":"07e132b2-5c1c-488e-abf4-bdaf3fcf4f93","Type":"ContainerStarted","Data":"b42522d474212ff3c91a19efe36a33233e297e919817cc8b6a24b4d5487e9655"} Oct 07 13:11:30 crc kubenswrapper[4959]: I1007 13:11:30.615173 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvrqm" Oct 07 13:11:41 crc kubenswrapper[4959]: I1007 13:11:41.937177 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q"] Oct 07 13:11:41 crc kubenswrapper[4959]: I1007 13:11:41.938653 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:41 crc kubenswrapper[4959]: I1007 13:11:41.940451 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 13:11:41 crc kubenswrapper[4959]: I1007 13:11:41.951597 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q"] Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.116782 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fncz\" (UniqueName: \"kubernetes.io/projected/ba266f6d-03a0-4e6f-b1fc-be71300ce515-kube-api-access-4fncz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.116870 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.116936 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: 
I1007 13:11:42.217822 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.217902 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fncz\" (UniqueName: \"kubernetes.io/projected/ba266f6d-03a0-4e6f-b1fc-be71300ce515-kube-api-access-4fncz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.217965 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.218580 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.218583 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.250236 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fncz\" (UniqueName: \"kubernetes.io/projected/ba266f6d-03a0-4e6f-b1fc-be71300ce515-kube-api-access-4fncz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.256308 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:42 crc kubenswrapper[4959]: I1007 13:11:42.466609 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q"] Oct 07 13:11:43 crc kubenswrapper[4959]: I1007 13:11:43.130993 4959 generic.go:334] "Generic (PLEG): container finished" podID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerID="a676eb574b68682efd1a4bed70e7a70fa253b38b7826a1d636779592df3ba8c6" exitCode=0 Oct 07 13:11:43 crc kubenswrapper[4959]: I1007 13:11:43.131038 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" event={"ID":"ba266f6d-03a0-4e6f-b1fc-be71300ce515","Type":"ContainerDied","Data":"a676eb574b68682efd1a4bed70e7a70fa253b38b7826a1d636779592df3ba8c6"} Oct 07 13:11:43 crc kubenswrapper[4959]: I1007 13:11:43.131062 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" event={"ID":"ba266f6d-03a0-4e6f-b1fc-be71300ce515","Type":"ContainerStarted","Data":"5ef99e12de5193d3f6811654d00b44d519f73ecc1f904a9dc618e436ad3be804"} Oct 07 13:11:45 crc kubenswrapper[4959]: I1007 13:11:45.144415 4959 generic.go:334] "Generic (PLEG): container finished" podID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerID="17a20d39db13bc35a6a43157628b77f1d57d8532355ea0aa55a6ece285e82c5d" exitCode=0 Oct 07 13:11:45 crc kubenswrapper[4959]: I1007 13:11:45.144479 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" event={"ID":"ba266f6d-03a0-4e6f-b1fc-be71300ce515","Type":"ContainerDied","Data":"17a20d39db13bc35a6a43157628b77f1d57d8532355ea0aa55a6ece285e82c5d"} Oct 07 13:11:46 crc kubenswrapper[4959]: I1007 13:11:46.155240 4959 generic.go:334] "Generic (PLEG): container finished" podID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerID="24bbb1c3d95f21ccd157387b39c94ec0748579e1b678ee161b2797090187e31b" exitCode=0 Oct 07 13:11:46 crc kubenswrapper[4959]: I1007 13:11:46.155386 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" event={"ID":"ba266f6d-03a0-4e6f-b1fc-be71300ce515","Type":"ContainerDied","Data":"24bbb1c3d95f21ccd157387b39c94ec0748579e1b678ee161b2797090187e31b"} Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.353792 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.479202 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-bundle\") pod \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.479778 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-bundle" (OuterVolumeSpecName: "bundle") pod "ba266f6d-03a0-4e6f-b1fc-be71300ce515" (UID: "ba266f6d-03a0-4e6f-b1fc-be71300ce515"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.480149 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-util\") pod \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.480189 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fncz\" (UniqueName: \"kubernetes.io/projected/ba266f6d-03a0-4e6f-b1fc-be71300ce515-kube-api-access-4fncz\") pod \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\" (UID: \"ba266f6d-03a0-4e6f-b1fc-be71300ce515\") " Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.480381 4959 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.487196 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ba266f6d-03a0-4e6f-b1fc-be71300ce515-kube-api-access-4fncz" (OuterVolumeSpecName: "kube-api-access-4fncz") pod "ba266f6d-03a0-4e6f-b1fc-be71300ce515" (UID: "ba266f6d-03a0-4e6f-b1fc-be71300ce515"). InnerVolumeSpecName "kube-api-access-4fncz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.494045 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-util" (OuterVolumeSpecName: "util") pod "ba266f6d-03a0-4e6f-b1fc-be71300ce515" (UID: "ba266f6d-03a0-4e6f-b1fc-be71300ce515"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.581058 4959 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba266f6d-03a0-4e6f-b1fc-be71300ce515-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:47 crc kubenswrapper[4959]: I1007 13:11:47.581097 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fncz\" (UniqueName: \"kubernetes.io/projected/ba266f6d-03a0-4e6f-b1fc-be71300ce515-kube-api-access-4fncz\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:48 crc kubenswrapper[4959]: I1007 13:11:48.168404 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" event={"ID":"ba266f6d-03a0-4e6f-b1fc-be71300ce515","Type":"ContainerDied","Data":"5ef99e12de5193d3f6811654d00b44d519f73ecc1f904a9dc618e436ad3be804"} Oct 07 13:11:48 crc kubenswrapper[4959]: I1007 13:11:48.168447 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q" Oct 07 13:11:48 crc kubenswrapper[4959]: I1007 13:11:48.168455 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef99e12de5193d3f6811654d00b44d519f73ecc1f904a9dc618e436ad3be804" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.834217 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-qxq76"] Oct 07 13:11:49 crc kubenswrapper[4959]: E1007 13:11:49.834785 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="util" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.834802 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="util" Oct 07 13:11:49 crc kubenswrapper[4959]: E1007 13:11:49.834817 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="extract" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.834824 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="extract" Oct 07 13:11:49 crc kubenswrapper[4959]: E1007 13:11:49.834833 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="pull" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.834841 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="pull" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.834956 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba266f6d-03a0-4e6f-b1fc-be71300ce515" containerName="extract" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.835406 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.838798 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-q68wd" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.839048 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.839452 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.848463 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-qxq76"] Oct 07 13:11:49 crc kubenswrapper[4959]: I1007 13:11:49.914475 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f45kw\" (UniqueName: \"kubernetes.io/projected/ecfe0080-5a6d-4580-957f-9b07016a6f38-kube-api-access-f45kw\") pod \"nmstate-operator-858ddd8f98-qxq76\" (UID: \"ecfe0080-5a6d-4580-957f-9b07016a6f38\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" Oct 07 13:11:50 crc kubenswrapper[4959]: I1007 13:11:50.015919 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f45kw\" (UniqueName: \"kubernetes.io/projected/ecfe0080-5a6d-4580-957f-9b07016a6f38-kube-api-access-f45kw\") pod \"nmstate-operator-858ddd8f98-qxq76\" (UID: \"ecfe0080-5a6d-4580-957f-9b07016a6f38\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" Oct 07 13:11:50 crc kubenswrapper[4959]: I1007 13:11:50.038932 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f45kw\" (UniqueName: \"kubernetes.io/projected/ecfe0080-5a6d-4580-957f-9b07016a6f38-kube-api-access-f45kw\") pod \"nmstate-operator-858ddd8f98-qxq76\" (UID: 
\"ecfe0080-5a6d-4580-957f-9b07016a6f38\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" Oct 07 13:11:50 crc kubenswrapper[4959]: I1007 13:11:50.153854 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" Oct 07 13:11:50 crc kubenswrapper[4959]: I1007 13:11:50.641195 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-qxq76"] Oct 07 13:11:51 crc kubenswrapper[4959]: I1007 13:11:51.193016 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" event={"ID":"ecfe0080-5a6d-4580-957f-9b07016a6f38","Type":"ContainerStarted","Data":"28b3d484f83c38d29887b37db952a13fd0fb247eb49d2f8db92867c785a48c66"} Oct 07 13:11:54 crc kubenswrapper[4959]: I1007 13:11:54.208689 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" event={"ID":"ecfe0080-5a6d-4580-957f-9b07016a6f38","Type":"ContainerStarted","Data":"85f705aae2fe44196737367b2267711ca028a85c7912a2f961b1200ad0751d3e"} Oct 07 13:11:54 crc kubenswrapper[4959]: I1007 13:11:54.223198 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qxq76" podStartSLOduration=2.723721008 podStartE2EDuration="5.223181383s" podCreationTimestamp="2025-10-07 13:11:49 +0000 UTC" firstStartedPulling="2025-10-07 13:11:50.65170635 +0000 UTC m=+662.812429027" lastFinishedPulling="2025-10-07 13:11:53.151166725 +0000 UTC m=+665.311889402" observedRunningTime="2025-10-07 13:11:54.222192854 +0000 UTC m=+666.382915531" watchObservedRunningTime="2025-10-07 13:11:54.223181383 +0000 UTC m=+666.383904050" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.118332 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 
13:11:55.119543 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.121718 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z92v7" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.131582 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.154138 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-648vt"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.154936 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.156211 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.163613 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-trnlr"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.164319 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.206909 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-648vt"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.304887 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nspnx\" (UniqueName: \"kubernetes.io/projected/b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb-kube-api-access-nspnx\") pod \"nmstate-webhook-6cdbc54649-648vt\" (UID: \"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.305383 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-798jt\" (UniqueName: \"kubernetes.io/projected/0ce82e41-c87c-4a6a-85c5-63fa6986a917-kube-api-access-798jt\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.305498 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-ovs-socket\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.305600 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhntj\" (UniqueName: \"kubernetes.io/projected/b74adf76-6b8e-4df4-a786-b241afc85aaf-kube-api-access-qhntj\") pod \"nmstate-metrics-fdff9cb8d-bqt8v\" (UID: \"b74adf76-6b8e-4df4-a786-b241afc85aaf\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.305734 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-648vt\" (UID: \"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.305836 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-dbus-socket\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.305915 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-nmstate-lock\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.347247 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.347925 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.350853 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.351126 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.351612 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zwx52" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.361221 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406507 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-648vt\" (UID: \"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406565 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e91d7e0c-6f6c-4305-88f1-316fda279894-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406599 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e91d7e0c-6f6c-4305-88f1-316fda279894-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406661 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-dbus-socket\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406692 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-nmstate-lock\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406728 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nspnx\" (UniqueName: \"kubernetes.io/projected/b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb-kube-api-access-nspnx\") pod \"nmstate-webhook-6cdbc54649-648vt\" (UID: \"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406746 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-798jt\" (UniqueName: \"kubernetes.io/projected/0ce82e41-c87c-4a6a-85c5-63fa6986a917-kube-api-access-798jt\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406779 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-ovs-socket\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " 
pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406797 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhntj\" (UniqueName: \"kubernetes.io/projected/b74adf76-6b8e-4df4-a786-b241afc85aaf-kube-api-access-qhntj\") pod \"nmstate-metrics-fdff9cb8d-bqt8v\" (UID: \"b74adf76-6b8e-4df4-a786-b241afc85aaf\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406791 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-nmstate-lock\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406819 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2smv5\" (UniqueName: \"kubernetes.io/projected/e91d7e0c-6f6c-4305-88f1-316fda279894-kube-api-access-2smv5\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406863 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-ovs-socket\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.406975 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0ce82e41-c87c-4a6a-85c5-63fa6986a917-dbus-socket\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " 
pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.423032 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-648vt\" (UID: \"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.425328 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhntj\" (UniqueName: \"kubernetes.io/projected/b74adf76-6b8e-4df4-a786-b241afc85aaf-kube-api-access-qhntj\") pod \"nmstate-metrics-fdff9cb8d-bqt8v\" (UID: \"b74adf76-6b8e-4df4-a786-b241afc85aaf\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.425865 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nspnx\" (UniqueName: \"kubernetes.io/projected/b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb-kube-api-access-nspnx\") pod \"nmstate-webhook-6cdbc54649-648vt\" (UID: \"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.436364 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-798jt\" (UniqueName: \"kubernetes.io/projected/0ce82e41-c87c-4a6a-85c5-63fa6986a917-kube-api-access-798jt\") pod \"nmstate-handler-trnlr\" (UID: \"0ce82e41-c87c-4a6a-85c5-63fa6986a917\") " pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.460826 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.486500 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.497430 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.507815 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2smv5\" (UniqueName: \"kubernetes.io/projected/e91d7e0c-6f6c-4305-88f1-316fda279894-kube-api-access-2smv5\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.507861 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e91d7e0c-6f6c-4305-88f1-316fda279894-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.507884 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e91d7e0c-6f6c-4305-88f1-316fda279894-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.508758 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e91d7e0c-6f6c-4305-88f1-316fda279894-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 
13:11:55.515425 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e91d7e0c-6f6c-4305-88f1-316fda279894-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.530642 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2smv5\" (UniqueName: \"kubernetes.io/projected/e91d7e0c-6f6c-4305-88f1-316fda279894-kube-api-access-2smv5\") pod \"nmstate-console-plugin-6b874cbd85-bc4dg\" (UID: \"e91d7e0c-6f6c-4305-88f1-316fda279894\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.532797 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75cc8fb5cc-56zfx"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.533499 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.546726 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cc8fb5cc-56zfx"] Oct 07 13:11:55 crc kubenswrapper[4959]: W1007 13:11:55.578809 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ce82e41_c87c_4a6a_85c5_63fa6986a917.slice/crio-75a7f09a26cc708c63ce5f7c94b787d318b53404019618bd82dffec32249914e WatchSource:0}: Error finding container 75a7f09a26cc708c63ce5f7c94b787d318b53404019618bd82dffec32249914e: Status 404 returned error can't find the container with id 75a7f09a26cc708c63ce5f7c94b787d318b53404019618bd82dffec32249914e Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.662213 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.709690 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-trusted-ca-bundle\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.710037 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09f4a382-546a-4db4-a583-6323ad2bef7b-console-serving-cert\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.710065 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-console-config\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.710092 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-service-ca\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.710121 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-oauth-serving-cert\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.710146 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwjcs\" (UniqueName: \"kubernetes.io/projected/09f4a382-546a-4db4-a583-6323ad2bef7b-kube-api-access-rwjcs\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.710189 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09f4a382-546a-4db4-a583-6323ad2bef7b-console-oauth-config\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.777451 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-648vt"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.811903 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09f4a382-546a-4db4-a583-6323ad2bef7b-console-oauth-config\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.811987 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-trusted-ca-bundle\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " 
pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.812018 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09f4a382-546a-4db4-a583-6323ad2bef7b-console-serving-cert\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.812058 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-console-config\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.812078 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-service-ca\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.812099 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-oauth-serving-cert\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.812141 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwjcs\" (UniqueName: \"kubernetes.io/projected/09f4a382-546a-4db4-a583-6323ad2bef7b-kube-api-access-rwjcs\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " 
pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.813109 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-console-config\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.813436 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-service-ca\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.813470 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-oauth-serving-cert\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.813938 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09f4a382-546a-4db4-a583-6323ad2bef7b-trusted-ca-bundle\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.818542 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09f4a382-546a-4db4-a583-6323ad2bef7b-console-oauth-config\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc 
kubenswrapper[4959]: I1007 13:11:55.824639 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09f4a382-546a-4db4-a583-6323ad2bef7b-console-serving-cert\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.830753 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwjcs\" (UniqueName: \"kubernetes.io/projected/09f4a382-546a-4db4-a583-6323ad2bef7b-kube-api-access-rwjcs\") pod \"console-75cc8fb5cc-56zfx\" (UID: \"09f4a382-546a-4db4-a583-6323ad2bef7b\") " pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.856800 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg"] Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.858357 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:11:55 crc kubenswrapper[4959]: W1007 13:11:55.867873 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode91d7e0c_6f6c_4305_88f1_316fda279894.slice/crio-b8a856492f71abb887e08168a6d7ade7a76aaff14e17b621e9854702f57e4c3b WatchSource:0}: Error finding container b8a856492f71abb887e08168a6d7ade7a76aaff14e17b621e9854702f57e4c3b: Status 404 returned error can't find the container with id b8a856492f71abb887e08168a6d7ade7a76aaff14e17b621e9854702f57e4c3b Oct 07 13:11:55 crc kubenswrapper[4959]: I1007 13:11:55.921917 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v"] Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.045892 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cc8fb5cc-56zfx"] Oct 07 13:11:56 crc kubenswrapper[4959]: W1007 13:11:56.051286 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f4a382_546a_4db4_a583_6323ad2bef7b.slice/crio-06105bf0867670de7c803e4ab123fd3fc04fb8fcbe349917e8c3e88863c729cf WatchSource:0}: Error finding container 06105bf0867670de7c803e4ab123fd3fc04fb8fcbe349917e8c3e88863c729cf: Status 404 returned error can't find the container with id 06105bf0867670de7c803e4ab123fd3fc04fb8fcbe349917e8c3e88863c729cf Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.221662 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" event={"ID":"b74adf76-6b8e-4df4-a786-b241afc85aaf","Type":"ContainerStarted","Data":"e233dc51cd4d904431fcb0b9a4c5db6da50a6a3e764de27cc8655177fc67c94d"} Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.222767 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-trnlr" 
event={"ID":"0ce82e41-c87c-4a6a-85c5-63fa6986a917","Type":"ContainerStarted","Data":"75a7f09a26cc708c63ce5f7c94b787d318b53404019618bd82dffec32249914e"} Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.223469 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" event={"ID":"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb","Type":"ContainerStarted","Data":"cd6e44d366df2d47b438b842121c73cee9ca1bd9f78119be50826fa86277a713"} Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.225252 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cc8fb5cc-56zfx" event={"ID":"09f4a382-546a-4db4-a583-6323ad2bef7b","Type":"ContainerStarted","Data":"3e9f0f4e5a2ed8653367d86d3ca925eb29ff50c6f0b0c71a90cc6ccaf3734211"} Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.225297 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cc8fb5cc-56zfx" event={"ID":"09f4a382-546a-4db4-a583-6323ad2bef7b","Type":"ContainerStarted","Data":"06105bf0867670de7c803e4ab123fd3fc04fb8fcbe349917e8c3e88863c729cf"} Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.225968 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" event={"ID":"e91d7e0c-6f6c-4305-88f1-316fda279894","Type":"ContainerStarted","Data":"b8a856492f71abb887e08168a6d7ade7a76aaff14e17b621e9854702f57e4c3b"} Oct 07 13:11:56 crc kubenswrapper[4959]: I1007 13:11:56.245532 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75cc8fb5cc-56zfx" podStartSLOduration=1.24551177 podStartE2EDuration="1.24551177s" podCreationTimestamp="2025-10-07 13:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:11:56.240498924 +0000 UTC m=+668.401221621" watchObservedRunningTime="2025-10-07 13:11:56.24551177 +0000 
UTC m=+668.406234447" Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.246135 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" event={"ID":"e91d7e0c-6f6c-4305-88f1-316fda279894","Type":"ContainerStarted","Data":"d00b3933d4e238a469a87c7a0d2214b41f67549533472caf49b40a63c3eb6f72"} Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.247367 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" event={"ID":"b74adf76-6b8e-4df4-a786-b241afc85aaf","Type":"ContainerStarted","Data":"931da29bb702ce31910168c30623c1db5fdab0d0bb866e629e65cb920d3ed0f1"} Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.248476 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-trnlr" event={"ID":"0ce82e41-c87c-4a6a-85c5-63fa6986a917","Type":"ContainerStarted","Data":"0f780dcb9e0fbcfc3f5fa985f91644d89a775eb961f0b2e56027ed38dc1bcd15"} Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.249083 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.250368 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" event={"ID":"b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb","Type":"ContainerStarted","Data":"1027f4e0ab042db0e49549a871babaedd31f249f8af7dfa063c4b920298adf76"} Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.250743 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.266102 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bc4dg" podStartSLOduration=1.4220911 podStartE2EDuration="4.266087704s" podCreationTimestamp="2025-10-07 13:11:55 
+0000 UTC" firstStartedPulling="2025-10-07 13:11:55.872431098 +0000 UTC m=+668.033153775" lastFinishedPulling="2025-10-07 13:11:58.716427702 +0000 UTC m=+670.877150379" observedRunningTime="2025-10-07 13:11:59.26251514 +0000 UTC m=+671.423237827" watchObservedRunningTime="2025-10-07 13:11:59.266087704 +0000 UTC m=+671.426810381" Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.285393 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-trnlr" podStartSLOduration=1.14501675 podStartE2EDuration="4.285376387s" podCreationTimestamp="2025-10-07 13:11:55 +0000 UTC" firstStartedPulling="2025-10-07 13:11:55.581548134 +0000 UTC m=+667.742270811" lastFinishedPulling="2025-10-07 13:11:58.721907751 +0000 UTC m=+670.882630448" observedRunningTime="2025-10-07 13:11:59.281553845 +0000 UTC m=+671.442276532" watchObservedRunningTime="2025-10-07 13:11:59.285376387 +0000 UTC m=+671.446099064" Oct 07 13:11:59 crc kubenswrapper[4959]: I1007 13:11:59.302083 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" podStartSLOduration=1.3760220969999999 podStartE2EDuration="4.302058893s" podCreationTimestamp="2025-10-07 13:11:55 +0000 UTC" firstStartedPulling="2025-10-07 13:11:55.799051018 +0000 UTC m=+667.959773695" lastFinishedPulling="2025-10-07 13:11:58.725087814 +0000 UTC m=+670.885810491" observedRunningTime="2025-10-07 13:11:59.29575707 +0000 UTC m=+671.456479767" watchObservedRunningTime="2025-10-07 13:11:59.302058893 +0000 UTC m=+671.462781580" Oct 07 13:12:01 crc kubenswrapper[4959]: I1007 13:12:01.263963 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" event={"ID":"b74adf76-6b8e-4df4-a786-b241afc85aaf","Type":"ContainerStarted","Data":"6bb8f7ab0db2f7415fac05811496c218632cd0320bbd3172f0031ce299cb64e5"} Oct 07 13:12:01 crc kubenswrapper[4959]: I1007 13:12:01.283760 4959 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-bqt8v" podStartSLOduration=1.316464319 podStartE2EDuration="6.283744415s" podCreationTimestamp="2025-10-07 13:11:55 +0000 UTC" firstStartedPulling="2025-10-07 13:11:55.956912652 +0000 UTC m=+668.117635329" lastFinishedPulling="2025-10-07 13:12:00.924192728 +0000 UTC m=+673.084915425" observedRunningTime="2025-10-07 13:12:01.282601311 +0000 UTC m=+673.443323998" watchObservedRunningTime="2025-10-07 13:12:01.283744415 +0000 UTC m=+673.444467092" Oct 07 13:12:05 crc kubenswrapper[4959]: I1007 13:12:05.529791 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-trnlr" Oct 07 13:12:05 crc kubenswrapper[4959]: I1007 13:12:05.858491 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:12:05 crc kubenswrapper[4959]: I1007 13:12:05.858556 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:12:05 crc kubenswrapper[4959]: I1007 13:12:05.864493 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:12:06 crc kubenswrapper[4959]: I1007 13:12:06.294667 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75cc8fb5cc-56zfx" Oct 07 13:12:06 crc kubenswrapper[4959]: I1007 13:12:06.337024 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-j72km"] Oct 07 13:12:15 crc kubenswrapper[4959]: I1007 13:12:15.491889 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-648vt" Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.132030 4959 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"]
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.134358 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.137843 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.140324 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"]
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.288336 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.288678 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn2w\" (UniqueName: \"kubernetes.io/projected/927d2a77-5bce-4896-aec7-67a816a96bf0-kube-api-access-fsn2w\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.288717 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.389950 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.390016 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn2w\" (UniqueName: \"kubernetes.io/projected/927d2a77-5bce-4896-aec7-67a816a96bf0-kube-api-access-fsn2w\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.390051 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.390512 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.390580 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.407889 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn2w\" (UniqueName: \"kubernetes.io/projected/927d2a77-5bce-4896-aec7-67a816a96bf0-kube-api-access-fsn2w\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.450323 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:29 crc kubenswrapper[4959]: I1007 13:12:29.664478 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"]
Oct 07 13:12:30 crc kubenswrapper[4959]: I1007 13:12:30.441554 4959 generic.go:334] "Generic (PLEG): container finished" podID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerID="ae05083283242f08921d5e12b2007c4fdf9d0bd83bc6bc6ca1b163986d937a3c" exitCode=0
Oct 07 13:12:30 crc kubenswrapper[4959]: I1007 13:12:30.441607 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb" event={"ID":"927d2a77-5bce-4896-aec7-67a816a96bf0","Type":"ContainerDied","Data":"ae05083283242f08921d5e12b2007c4fdf9d0bd83bc6bc6ca1b163986d937a3c"}
Oct 07 13:12:30 crc kubenswrapper[4959]: I1007 13:12:30.441656 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb" event={"ID":"927d2a77-5bce-4896-aec7-67a816a96bf0","Type":"ContainerStarted","Data":"bf259da3dc6aca706f111d4f5bc26d07b56564c6c6e628782bca54c54bc303a2"}
Oct 07 13:12:31 crc kubenswrapper[4959]: I1007 13:12:31.385971 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-j72km" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerName="console" containerID="cri-o://fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526" gracePeriod=15
Oct 07 13:12:31 crc kubenswrapper[4959]: I1007 13:12:31.868702 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-j72km_dd723ca8-9cfa-465f-b706-feaa015d9e0d/console/0.log"
Oct 07 13:12:31 crc kubenswrapper[4959]: I1007 13:12:31.868793 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j72km"
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.023481 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-serving-cert\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024012 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-trusted-ca-bundle\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024108 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-oauth-serving-cert\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024177 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-service-ca\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024273 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-oauth-config\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024308 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbl7w\" (UniqueName: \"kubernetes.io/projected/dd723ca8-9cfa-465f-b706-feaa015d9e0d-kube-api-access-lbl7w\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024355 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-config\") pod \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\" (UID: \"dd723ca8-9cfa-465f-b706-feaa015d9e0d\") "
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.024944 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.026264 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-config" (OuterVolumeSpecName: "console-config") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.026414 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.026802 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.030272 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.030885 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.031736 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd723ca8-9cfa-465f-b706-feaa015d9e0d-kube-api-access-lbl7w" (OuterVolumeSpecName: "kube-api-access-lbl7w") pod "dd723ca8-9cfa-465f-b706-feaa015d9e0d" (UID: "dd723ca8-9cfa-465f-b706-feaa015d9e0d"). InnerVolumeSpecName "kube-api-access-lbl7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125258 4959 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125317 4959 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-service-ca\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125332 4959 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125350 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbl7w\" (UniqueName: \"kubernetes.io/projected/dd723ca8-9cfa-465f-b706-feaa015d9e0d-kube-api-access-lbl7w\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125368 4959 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125382 4959 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd723ca8-9cfa-465f-b706-feaa015d9e0d-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.125397 4959 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd723ca8-9cfa-465f-b706-feaa015d9e0d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.457444 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-j72km_dd723ca8-9cfa-465f-b706-feaa015d9e0d/console/0.log"
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.457505 4959 generic.go:334] "Generic (PLEG): container finished" podID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerID="fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526" exitCode=2
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.457540 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j72km" event={"ID":"dd723ca8-9cfa-465f-b706-feaa015d9e0d","Type":"ContainerDied","Data":"fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526"}
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.457573 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j72km" event={"ID":"dd723ca8-9cfa-465f-b706-feaa015d9e0d","Type":"ContainerDied","Data":"69e75aad4d2b43e7600043e5fe2d700df66749853eeaf5ef64ba245d55a469a6"}
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.457593 4959 scope.go:117] "RemoveContainer" containerID="fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526"
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.457666 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j72km"
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.492569 4959 scope.go:117] "RemoveContainer" containerID="fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526"
Oct 07 13:12:32 crc kubenswrapper[4959]: E1007 13:12:32.494404 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526\": container with ID starting with fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526 not found: ID does not exist" containerID="fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526"
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.494448 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526"} err="failed to get container status \"fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526\": rpc error: code = NotFound desc = could not find container \"fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526\": container with ID starting with fea2db08e8e8c045e09cea044084978f9b244970eb510aa700fae99ac59cb526 not found: ID does not exist"
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.514730 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-j72km"]
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.519079 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-j72km"]
Oct 07 13:12:32 crc kubenswrapper[4959]: I1007 13:12:32.815999 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" path="/var/lib/kubelet/pods/dd723ca8-9cfa-465f-b706-feaa015d9e0d/volumes"
Oct 07 13:12:33 crc kubenswrapper[4959]: I1007 13:12:33.464561 4959 generic.go:334] "Generic (PLEG): container finished" podID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerID="8b8bfd82de673abe147e4a0fcbd2201482b9e416e5f0c6ec5c45e586882abda7" exitCode=0
Oct 07 13:12:33 crc kubenswrapper[4959]: I1007 13:12:33.464707 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb" event={"ID":"927d2a77-5bce-4896-aec7-67a816a96bf0","Type":"ContainerDied","Data":"8b8bfd82de673abe147e4a0fcbd2201482b9e416e5f0c6ec5c45e586882abda7"}
Oct 07 13:12:34 crc kubenswrapper[4959]: I1007 13:12:34.478273 4959 generic.go:334] "Generic (PLEG): container finished" podID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerID="189a1fd29bc1e29ea930d722646674f61cc779d48cd42bd87fd18722636b019a" exitCode=0
Oct 07 13:12:34 crc kubenswrapper[4959]: I1007 13:12:34.478335 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb" event={"ID":"927d2a77-5bce-4896-aec7-67a816a96bf0","Type":"ContainerDied","Data":"189a1fd29bc1e29ea930d722646674f61cc779d48cd42bd87fd18722636b019a"}
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.767069 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.887036 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-util\") pod \"927d2a77-5bce-4896-aec7-67a816a96bf0\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") "
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.887359 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-bundle\") pod \"927d2a77-5bce-4896-aec7-67a816a96bf0\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") "
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.887486 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsn2w\" (UniqueName: \"kubernetes.io/projected/927d2a77-5bce-4896-aec7-67a816a96bf0-kube-api-access-fsn2w\") pod \"927d2a77-5bce-4896-aec7-67a816a96bf0\" (UID: \"927d2a77-5bce-4896-aec7-67a816a96bf0\") "
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.888201 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-bundle" (OuterVolumeSpecName: "bundle") pod "927d2a77-5bce-4896-aec7-67a816a96bf0" (UID: "927d2a77-5bce-4896-aec7-67a816a96bf0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.894901 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927d2a77-5bce-4896-aec7-67a816a96bf0-kube-api-access-fsn2w" (OuterVolumeSpecName: "kube-api-access-fsn2w") pod "927d2a77-5bce-4896-aec7-67a816a96bf0" (UID: "927d2a77-5bce-4896-aec7-67a816a96bf0"). InnerVolumeSpecName "kube-api-access-fsn2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.896794 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-util" (OuterVolumeSpecName: "util") pod "927d2a77-5bce-4896-aec7-67a816a96bf0" (UID: "927d2a77-5bce-4896-aec7-67a816a96bf0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.988735 4959 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-util\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.988977 4959 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927d2a77-5bce-4896-aec7-67a816a96bf0-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:35 crc kubenswrapper[4959]: I1007 13:12:35.989094 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsn2w\" (UniqueName: \"kubernetes.io/projected/927d2a77-5bce-4896-aec7-67a816a96bf0-kube-api-access-fsn2w\") on node \"crc\" DevicePath \"\""
Oct 07 13:12:36 crc kubenswrapper[4959]: I1007 13:12:36.495483 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb" event={"ID":"927d2a77-5bce-4896-aec7-67a816a96bf0","Type":"ContainerDied","Data":"bf259da3dc6aca706f111d4f5bc26d07b56564c6c6e628782bca54c54bc303a2"}
Oct 07 13:12:36 crc kubenswrapper[4959]: I1007 13:12:36.495539 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf259da3dc6aca706f111d4f5bc26d07b56564c6c6e628782bca54c54bc303a2"
Oct 07 13:12:36 crc kubenswrapper[4959]: I1007 13:12:36.495567 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.392339 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"]
Oct 07 13:12:44 crc kubenswrapper[4959]: E1007 13:12:44.394270 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="pull"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.394354 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="pull"
Oct 07 13:12:44 crc kubenswrapper[4959]: E1007 13:12:44.394410 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerName="console"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.394462 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerName="console"
Oct 07 13:12:44 crc kubenswrapper[4959]: E1007 13:12:44.394515 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="extract"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.394568 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="extract"
Oct 07 13:12:44 crc kubenswrapper[4959]: E1007 13:12:44.394645 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="util"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.394741 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="util"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.394973 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd723ca8-9cfa-465f-b706-feaa015d9e0d" containerName="console"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.395074 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="927d2a77-5bce-4896-aec7-67a816a96bf0" containerName="extract"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.395575 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.404088 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.404318 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.406133 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.406171 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mrttd"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.409473 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.446734 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"]
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.505751 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lztrm\" (UniqueName: \"kubernetes.io/projected/c2450601-6bf6-4ee8-af46-dafc0db98d8c-kube-api-access-lztrm\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.506056 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2450601-6bf6-4ee8-af46-dafc0db98d8c-apiservice-cert\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.506191 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2450601-6bf6-4ee8-af46-dafc0db98d8c-webhook-cert\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.607674 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lztrm\" (UniqueName: \"kubernetes.io/projected/c2450601-6bf6-4ee8-af46-dafc0db98d8c-kube-api-access-lztrm\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.607752 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2450601-6bf6-4ee8-af46-dafc0db98d8c-apiservice-cert\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.608108 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2450601-6bf6-4ee8-af46-dafc0db98d8c-webhook-cert\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.622413 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2450601-6bf6-4ee8-af46-dafc0db98d8c-webhook-cert\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.622444 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2450601-6bf6-4ee8-af46-dafc0db98d8c-apiservice-cert\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.626262 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lztrm\" (UniqueName: \"kubernetes.io/projected/c2450601-6bf6-4ee8-af46-dafc0db98d8c-kube-api-access-lztrm\") pod \"metallb-operator-controller-manager-79dcdc88ff-jv2fl\" (UID: \"c2450601-6bf6-4ee8-af46-dafc0db98d8c\") " pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.689733 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"]
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.690507 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.691872 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.693594 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.696032 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7bhfx"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.702601 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"]
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.714293 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.812098 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-webhook-cert\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.812528 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcsnw\" (UniqueName: \"kubernetes.io/projected/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-kube-api-access-bcsnw\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.812569 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-apiservice-cert\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.914990 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-webhook-cert\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.915050 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcsnw\" (UniqueName: \"kubernetes.io/projected/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-kube-api-access-bcsnw\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.915085 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-apiservice-cert\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.921771 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-webhook-cert\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.934289 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-apiservice-cert\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.937750 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcsnw\" (UniqueName: \"kubernetes.io/projected/d1fbc67f-df69-48a4-87a0-e9d429eca6f1-kube-api-access-bcsnw\") pod \"metallb-operator-webhook-server-678c485567-fb7ps\" (UID: \"d1fbc67f-df69-48a4-87a0-e9d429eca6f1\") " pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:44 crc kubenswrapper[4959]: I1007 13:12:44.948064 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"]
Oct 07 13:12:45 crc kubenswrapper[4959]: I1007 13:12:45.007705 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:45 crc kubenswrapper[4959]: I1007 13:12:45.226572 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"]
Oct 07 13:12:45 crc kubenswrapper[4959]: W1007 13:12:45.233777 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1fbc67f_df69_48a4_87a0_e9d429eca6f1.slice/crio-ddd818fc1eaefb300d0cd137ebf86eb28762945c77318a5f117d80baad02e604 WatchSource:0}: Error finding container ddd818fc1eaefb300d0cd137ebf86eb28762945c77318a5f117d80baad02e604: Status 404 returned error can't find the container with id ddd818fc1eaefb300d0cd137ebf86eb28762945c77318a5f117d80baad02e604
Oct 07 13:12:45 crc kubenswrapper[4959]: I1007 13:12:45.544418 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps" event={"ID":"d1fbc67f-df69-48a4-87a0-e9d429eca6f1","Type":"ContainerStarted","Data":"ddd818fc1eaefb300d0cd137ebf86eb28762945c77318a5f117d80baad02e604"}
Oct 07 13:12:45 crc kubenswrapper[4959]: I1007 13:12:45.545502 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl" event={"ID":"c2450601-6bf6-4ee8-af46-dafc0db98d8c","Type":"ContainerStarted","Data":"5d329d4b94a8f112940f4f7e20b2671f9dabfb1f3986581a9f7b94c42471f471"}
Oct 07 13:12:48 crc kubenswrapper[4959]: I1007 13:12:48.563484 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl" event={"ID":"c2450601-6bf6-4ee8-af46-dafc0db98d8c","Type":"ContainerStarted","Data":"d412ab0ca8f757a6bc8e1a3f69d5426b0361051abbf32f7dc12a7bdde81eda0a"}
Oct 07 13:12:48 crc kubenswrapper[4959]: I1007 13:12:48.564158 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl"
Oct 07 13:12:48 crc kubenswrapper[4959]: I1007 13:12:48.586948 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl" podStartSLOduration=1.849294191 podStartE2EDuration="4.586934231s" podCreationTimestamp="2025-10-07 13:12:44 +0000 UTC" firstStartedPulling="2025-10-07 13:12:44.960343505 +0000 UTC m=+717.121066182" lastFinishedPulling="2025-10-07 13:12:47.697983545 +0000 UTC m=+719.858706222" observedRunningTime="2025-10-07 13:12:48.583237696 +0000 UTC m=+720.743960373" watchObservedRunningTime="2025-10-07 13:12:48.586934231 +0000 UTC m=+720.747656908"
Oct 07 13:12:50 crc kubenswrapper[4959]: I1007 13:12:50.576655 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps" event={"ID":"d1fbc67f-df69-48a4-87a0-e9d429eca6f1","Type":"ContainerStarted","Data":"5077d6fb9c39ed7bd71029dead095df779572041a9de22571ef38fce19ba0a71"}
Oct 07 13:12:50 crc kubenswrapper[4959]: I1007 13:12:50.577040 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps"
Oct 07 13:12:50 crc kubenswrapper[4959]: I1007 13:12:50.614786 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps" podStartSLOduration=2.344421788 podStartE2EDuration="6.614752015s" podCreationTimestamp="2025-10-07 13:12:44 +0000 UTC" firstStartedPulling="2025-10-07 13:12:45.236137 +0000 UTC m=+717.396859677" lastFinishedPulling="2025-10-07 13:12:49.506467227 +0000 UTC m=+721.667189904" observedRunningTime="2025-10-07 13:12:50.608926519 +0000 UTC m=+722.769649226" watchObservedRunningTime="2025-10-07 13:12:50.614752015 +0000 UTC m=+722.775474772"
Oct 07 13:13:05 crc kubenswrapper[4959]: I1007 13:13:05.013614 4959 kubelet.go:2542]
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-678c485567-fb7ps" Oct 07 13:13:07 crc kubenswrapper[4959]: I1007 13:13:07.695405 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:13:07 crc kubenswrapper[4959]: I1007 13:13:07.695756 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.477968 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cptrk"] Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.478608 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" containerID="cri-o://120071c18d87cb632242d156e447b8d2373b30c737f42f0f6b24a6f4f1712edd" gracePeriod=30 Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.590240 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"] Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.590475 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerName="route-controller-manager" 
containerID="cri-o://4ef1600272ac8a55eaaf8559867dc70e33c4573f1404af4dbc9ec6695af9c696" gracePeriod=30 Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.748989 4959 generic.go:334] "Generic (PLEG): container finished" podID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerID="120071c18d87cb632242d156e447b8d2373b30c737f42f0f6b24a6f4f1712edd" exitCode=0 Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.749369 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" event={"ID":"6fd888a7-3fd1-4d12-8005-94fdae5be125","Type":"ContainerDied","Data":"120071c18d87cb632242d156e447b8d2373b30c737f42f0f6b24a6f4f1712edd"} Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.762368 4959 generic.go:334] "Generic (PLEG): container finished" podID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerID="4ef1600272ac8a55eaaf8559867dc70e33c4573f1404af4dbc9ec6695af9c696" exitCode=0 Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.762410 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" event={"ID":"58bc42fb-61ba-4342-98c6-45535e156eb6","Type":"ContainerDied","Data":"4ef1600272ac8a55eaaf8559867dc70e33c4573f1404af4dbc9ec6695af9c696"} Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.909836 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:13:20 crc kubenswrapper[4959]: I1007 13:13:20.962819 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077575 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmzc4\" (UniqueName: \"kubernetes.io/projected/58bc42fb-61ba-4342-98c6-45535e156eb6-kube-api-access-lmzc4\") pod \"58bc42fb-61ba-4342-98c6-45535e156eb6\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077622 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmtdr\" (UniqueName: \"kubernetes.io/projected/6fd888a7-3fd1-4d12-8005-94fdae5be125-kube-api-access-fmtdr\") pod \"6fd888a7-3fd1-4d12-8005-94fdae5be125\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077698 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc42fb-61ba-4342-98c6-45535e156eb6-serving-cert\") pod \"58bc42fb-61ba-4342-98c6-45535e156eb6\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077747 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-proxy-ca-bundles\") pod \"6fd888a7-3fd1-4d12-8005-94fdae5be125\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077767 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-config\") pod \"6fd888a7-3fd1-4d12-8005-94fdae5be125\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077814 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-client-ca\") pod \"58bc42fb-61ba-4342-98c6-45535e156eb6\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.077855 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-client-ca\") pod \"6fd888a7-3fd1-4d12-8005-94fdae5be125\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.078518 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fd888a7-3fd1-4d12-8005-94fdae5be125" (UID: "6fd888a7-3fd1-4d12-8005-94fdae5be125"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.078543 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "58bc42fb-61ba-4342-98c6-45535e156eb6" (UID: "58bc42fb-61ba-4342-98c6-45535e156eb6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.078650 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd888a7-3fd1-4d12-8005-94fdae5be125-serving-cert\") pod \"6fd888a7-3fd1-4d12-8005-94fdae5be125\" (UID: \"6fd888a7-3fd1-4d12-8005-94fdae5be125\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.078678 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-config" (OuterVolumeSpecName: "config") pod "6fd888a7-3fd1-4d12-8005-94fdae5be125" (UID: "6fd888a7-3fd1-4d12-8005-94fdae5be125"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079083 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6fd888a7-3fd1-4d12-8005-94fdae5be125" (UID: "6fd888a7-3fd1-4d12-8005-94fdae5be125"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079167 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-config\") pod \"58bc42fb-61ba-4342-98c6-45535e156eb6\" (UID: \"58bc42fb-61ba-4342-98c6-45535e156eb6\") " Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079534 4959 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079557 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079566 4959 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079574 4959 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fd888a7-3fd1-4d12-8005-94fdae5be125-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.079879 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-config" (OuterVolumeSpecName: "config") pod "58bc42fb-61ba-4342-98c6-45535e156eb6" (UID: "58bc42fb-61ba-4342-98c6-45535e156eb6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.083580 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd888a7-3fd1-4d12-8005-94fdae5be125-kube-api-access-fmtdr" (OuterVolumeSpecName: "kube-api-access-fmtdr") pod "6fd888a7-3fd1-4d12-8005-94fdae5be125" (UID: "6fd888a7-3fd1-4d12-8005-94fdae5be125"). InnerVolumeSpecName "kube-api-access-fmtdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.083586 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58bc42fb-61ba-4342-98c6-45535e156eb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58bc42fb-61ba-4342-98c6-45535e156eb6" (UID: "58bc42fb-61ba-4342-98c6-45535e156eb6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.083620 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58bc42fb-61ba-4342-98c6-45535e156eb6-kube-api-access-lmzc4" (OuterVolumeSpecName: "kube-api-access-lmzc4") pod "58bc42fb-61ba-4342-98c6-45535e156eb6" (UID: "58bc42fb-61ba-4342-98c6-45535e156eb6"). InnerVolumeSpecName "kube-api-access-lmzc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.083699 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd888a7-3fd1-4d12-8005-94fdae5be125-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fd888a7-3fd1-4d12-8005-94fdae5be125" (UID: "6fd888a7-3fd1-4d12-8005-94fdae5be125"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.181461 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd888a7-3fd1-4d12-8005-94fdae5be125-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.181514 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58bc42fb-61ba-4342-98c6-45535e156eb6-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.181533 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmzc4\" (UniqueName: \"kubernetes.io/projected/58bc42fb-61ba-4342-98c6-45535e156eb6-kube-api-access-lmzc4\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.181552 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmtdr\" (UniqueName: \"kubernetes.io/projected/6fd888a7-3fd1-4d12-8005-94fdae5be125-kube-api-access-fmtdr\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.181571 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc42fb-61ba-4342-98c6-45535e156eb6-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.769050 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" event={"ID":"58bc42fb-61ba-4342-98c6-45535e156eb6","Type":"ContainerDied","Data":"42dbb7bbc09af41386ae22f0fe0451cb7a60da086f44398677247e55a9804a16"} Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.769117 4959 scope.go:117] "RemoveContainer" containerID="4ef1600272ac8a55eaaf8559867dc70e33c4573f1404af4dbc9ec6695af9c696" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.769129 4959 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.772160 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" event={"ID":"6fd888a7-3fd1-4d12-8005-94fdae5be125","Type":"ContainerDied","Data":"1f991f34a4c0bf74977101d05f5dfbdf4957db2fab389c670739c8fbeaf792ce"} Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.772266 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cptrk" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.793227 4959 scope.go:117] "RemoveContainer" containerID="120071c18d87cb632242d156e447b8d2373b30c737f42f0f6b24a6f4f1712edd" Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.814001 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cptrk"] Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.819573 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cptrk"] Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.822294 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"] Oct 07 13:13:21 crc kubenswrapper[4959]: I1007 13:13:21.825803 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r984j"] Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.185126 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf"] Oct 07 13:13:22 crc kubenswrapper[4959]: E1007 13:13:22.185668 4959 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerName="route-controller-manager" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.185695 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerName="route-controller-manager" Oct 07 13:13:22 crc kubenswrapper[4959]: E1007 13:13:22.185712 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.185722 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.185927 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" containerName="controller-manager" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.185950 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" containerName="route-controller-manager" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.186649 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.188062 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-589878b787-ffndt"] Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.188876 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.191938 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.192012 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.192016 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.192053 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.192244 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.192406 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.192702 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.193027 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.193070 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.193273 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.193330 4959 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.195124 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.203497 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-589878b787-ffndt"] Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.206262 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.223453 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf"] Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292775 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-config\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292821 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-client-ca\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292844 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfz6\" (UniqueName: 
\"kubernetes.io/projected/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-kube-api-access-crfz6\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292868 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-proxy-ca-bundles\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292887 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hthf2\" (UniqueName: \"kubernetes.io/projected/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-kube-api-access-hthf2\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292907 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-serving-cert\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.292933 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-client-ca\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " 
pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.293188 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-config\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.293279 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-serving-cert\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.305424 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf"] Oct 07 13:13:22 crc kubenswrapper[4959]: E1007 13:13:22.305974 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-hthf2 serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" podUID="a62b9ab3-6dee-49ca-a871-1c375c84cc6b" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.394312 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfz6\" (UniqueName: \"kubernetes.io/projected/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-kube-api-access-crfz6\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " 
pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.394592 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-proxy-ca-bundles\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.394751 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hthf2\" (UniqueName: \"kubernetes.io/projected/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-kube-api-access-hthf2\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.394873 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-serving-cert\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.394959 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-client-ca\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.395073 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-config\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.395157 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-serving-cert\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.395273 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-config\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.395350 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-client-ca\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.396048 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-client-ca\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.396484 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-config\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.396501 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-client-ca\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.396962 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-proxy-ca-bundles\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.397117 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-config\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.402510 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-serving-cert\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc 
kubenswrapper[4959]: I1007 13:13:22.402581 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-serving-cert\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.412465 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfz6\" (UniqueName: \"kubernetes.io/projected/5d762056-e3a2-4bf9-b847-c2ee4ea864c6-kube-api-access-crfz6\") pod \"controller-manager-589878b787-ffndt\" (UID: \"5d762056-e3a2-4bf9-b847-c2ee4ea864c6\") " pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.416307 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hthf2\" (UniqueName: \"kubernetes.io/projected/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-kube-api-access-hthf2\") pod \"route-controller-manager-57b766fdc9-dzmpf\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.523029 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.708283 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-589878b787-ffndt"] Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.779650 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" event={"ID":"5d762056-e3a2-4bf9-b847-c2ee4ea864c6","Type":"ContainerStarted","Data":"054322fb57ec66b07d617da88b0a75c2026d2dc6def432c4a9a3610b7ccc6aab"} Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.780559 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.820173 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58bc42fb-61ba-4342-98c6-45535e156eb6" path="/var/lib/kubelet/pods/58bc42fb-61ba-4342-98c6-45535e156eb6/volumes" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.821314 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd888a7-3fd1-4d12-8005-94fdae5be125" path="/var/lib/kubelet/pods/6fd888a7-3fd1-4d12-8005-94fdae5be125/volumes" Oct 07 13:13:22 crc kubenswrapper[4959]: I1007 13:13:22.903113 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.017116 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hthf2\" (UniqueName: \"kubernetes.io/projected/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-kube-api-access-hthf2\") pod \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.017525 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-config\") pod \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.017572 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-client-ca\") pod \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.017612 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-serving-cert\") pod \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\" (UID: \"a62b9ab3-6dee-49ca-a871-1c375c84cc6b\") " Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.018264 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a62b9ab3-6dee-49ca-a871-1c375c84cc6b" (UID: "a62b9ab3-6dee-49ca-a871-1c375c84cc6b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.018429 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-config" (OuterVolumeSpecName: "config") pod "a62b9ab3-6dee-49ca-a871-1c375c84cc6b" (UID: "a62b9ab3-6dee-49ca-a871-1c375c84cc6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.041527 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-kube-api-access-hthf2" (OuterVolumeSpecName: "kube-api-access-hthf2") pod "a62b9ab3-6dee-49ca-a871-1c375c84cc6b" (UID: "a62b9ab3-6dee-49ca-a871-1c375c84cc6b"). InnerVolumeSpecName "kube-api-access-hthf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.041669 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a62b9ab3-6dee-49ca-a871-1c375c84cc6b" (UID: "a62b9ab3-6dee-49ca-a871-1c375c84cc6b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.119711 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hthf2\" (UniqueName: \"kubernetes.io/projected/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-kube-api-access-hthf2\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.119782 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.119797 4959 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.119810 4959 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b9ab3-6dee-49ca-a871-1c375c84cc6b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.786952 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.786981 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" event={"ID":"5d762056-e3a2-4bf9-b847-c2ee4ea864c6","Type":"ContainerStarted","Data":"043dbafca5dd5aa62fda928ef956176244cd47e696cf4b0eae86045b32ed3df2"} Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.787388 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.794199 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.803734 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-589878b787-ffndt" podStartSLOduration=3.803708893 podStartE2EDuration="3.803708893s" podCreationTimestamp="2025-10-07 13:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:13:23.802331624 +0000 UTC m=+755.963054311" watchObservedRunningTime="2025-10-07 13:13:23.803708893 +0000 UTC m=+755.964431580" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.866510 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf"] Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.877001 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g"] Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.878179 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.881956 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.882105 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.882202 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.882208 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.882305 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.887705 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.888408 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b766fdc9-dzmpf"] Oct 07 13:13:23 crc kubenswrapper[4959]: I1007 13:13:23.898014 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g"] Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.031728 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbfc5c-8710-42b4-bff3-5f793508dc0a-serving-cert\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: 
\"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.031829 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbfc5c-8710-42b4-bff3-5f793508dc0a-client-ca\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.031879 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbfc5c-8710-42b4-bff3-5f793508dc0a-config\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.031911 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgvl\" (UniqueName: \"kubernetes.io/projected/12cbfc5c-8710-42b4-bff3-5f793508dc0a-kube-api-access-7mgvl\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.133350 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbfc5c-8710-42b4-bff3-5f793508dc0a-serving-cert\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.133434 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbfc5c-8710-42b4-bff3-5f793508dc0a-client-ca\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.133486 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbfc5c-8710-42b4-bff3-5f793508dc0a-config\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.133539 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgvl\" (UniqueName: \"kubernetes.io/projected/12cbfc5c-8710-42b4-bff3-5f793508dc0a-kube-api-access-7mgvl\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.134709 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbfc5c-8710-42b4-bff3-5f793508dc0a-client-ca\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.135335 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbfc5c-8710-42b4-bff3-5f793508dc0a-config\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " 
pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.138425 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbfc5c-8710-42b4-bff3-5f793508dc0a-serving-cert\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.150889 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgvl\" (UniqueName: \"kubernetes.io/projected/12cbfc5c-8710-42b4-bff3-5f793508dc0a-kube-api-access-7mgvl\") pod \"route-controller-manager-65b9fb487c-dxk6g\" (UID: \"12cbfc5c-8710-42b4-bff3-5f793508dc0a\") " pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.214560 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.475668 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g"] Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.717316 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79dcdc88ff-jv2fl" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.795446 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" event={"ID":"12cbfc5c-8710-42b4-bff3-5f793508dc0a","Type":"ContainerStarted","Data":"ed20f673c6802dc3ef66cf1463746a1fff882921156f48b9813a1ee18eb63ef5"} Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.795492 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" event={"ID":"12cbfc5c-8710-42b4-bff3-5f793508dc0a","Type":"ContainerStarted","Data":"a25247299dbc406559602305ee7e8dff95def01219eb1e39e9f1e17cfb5ea2bf"} Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.795891 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.816203 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62b9ab3-6dee-49ca-a871-1c375c84cc6b" path="/var/lib/kubelet/pods/a62b9ab3-6dee-49ca-a871-1c375c84cc6b/volumes" Oct 07 13:13:24 crc kubenswrapper[4959]: I1007 13:13:24.822678 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" podStartSLOduration=2.822602742 podStartE2EDuration="2.822602742s" 
podCreationTimestamp="2025-10-07 13:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:13:24.821597513 +0000 UTC m=+756.982320210" watchObservedRunningTime="2025-10-07 13:13:24.822602742 +0000 UTC m=+756.983325419" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.126989 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b9fb487c-dxk6g" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.506002 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx"] Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.506610 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.508706 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-j7tbs" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.508783 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.516388 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx"] Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.523066 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cf76d"] Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.525120 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.531045 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.531118 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.614758 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dgrrl"] Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.615773 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.618864 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t5rzj" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.621396 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.621459 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.621396 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.628407 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-7b7r7"] Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.629224 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.633242 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.643676 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7b7r7"] Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.651775 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-startup\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.651826 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a786d6e0-64e4-4bfb-a93a-673b9d775053-metrics-certs\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.651848 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-sockets\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.651866 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnjs\" (UniqueName: \"kubernetes.io/projected/ea4413f6-7433-4301-856f-51073cbf20b0-kube-api-access-wgnjs\") pod \"frr-k8s-webhook-server-64bf5d555-thbcx\" (UID: \"ea4413f6-7433-4301-856f-51073cbf20b0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 
07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.651885 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpq9\" (UniqueName: \"kubernetes.io/projected/a786d6e0-64e4-4bfb-a93a-673b9d775053-kube-api-access-7rpq9\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.651987 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea4413f6-7433-4301-856f-51073cbf20b0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-thbcx\" (UID: \"ea4413f6-7433-4301-856f-51073cbf20b0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.652011 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-reloader\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.652033 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-conf\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.652176 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-metrics\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.752873 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-startup\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.752935 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a786d6e0-64e4-4bfb-a93a-673b9d775053-metrics-certs\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.752955 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-sockets\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.752973 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnjs\" (UniqueName: \"kubernetes.io/projected/ea4413f6-7433-4301-856f-51073cbf20b0-kube-api-access-wgnjs\") pod \"frr-k8s-webhook-server-64bf5d555-thbcx\" (UID: \"ea4413f6-7433-4301-856f-51073cbf20b0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.752988 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpq9\" (UniqueName: \"kubernetes.io/projected/a786d6e0-64e4-4bfb-a93a-673b9d775053-kube-api-access-7rpq9\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753013 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-87bkn\" (UniqueName: \"kubernetes.io/projected/4a978343-2c48-4153-a20e-631bbe3c1595-kube-api-access-87bkn\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753033 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sjzl\" (UniqueName: \"kubernetes.io/projected/ec5e2185-a03f-459b-95ce-cf8a04c9742d-kube-api-access-5sjzl\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753055 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea4413f6-7433-4301-856f-51073cbf20b0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-thbcx\" (UID: \"ea4413f6-7433-4301-856f-51073cbf20b0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753079 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-reloader\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753102 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a978343-2c48-4153-a20e-631bbe3c1595-metallb-excludel2\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753118 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-conf\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753142 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-metrics\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753160 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-metrics-certs\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753181 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753203 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec5e2185-a03f-459b-95ce-cf8a04c9742d-metrics-certs\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753223 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e2185-a03f-459b-95ce-cf8a04c9742d-cert\") pod \"controller-68d546b9d8-7b7r7\" (UID: 
\"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.753681 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-sockets\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.754907 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-startup\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.755112 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-frr-conf\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.755413 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-reloader\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.755576 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a786d6e0-64e4-4bfb-a93a-673b9d775053-metrics\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.760834 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ea4413f6-7433-4301-856f-51073cbf20b0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-thbcx\" (UID: \"ea4413f6-7433-4301-856f-51073cbf20b0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.779504 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a786d6e0-64e4-4bfb-a93a-673b9d775053-metrics-certs\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.782117 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnjs\" (UniqueName: \"kubernetes.io/projected/ea4413f6-7433-4301-856f-51073cbf20b0-kube-api-access-wgnjs\") pod \"frr-k8s-webhook-server-64bf5d555-thbcx\" (UID: \"ea4413f6-7433-4301-856f-51073cbf20b0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.783320 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpq9\" (UniqueName: \"kubernetes.io/projected/a786d6e0-64e4-4bfb-a93a-673b9d775053-kube-api-access-7rpq9\") pod \"frr-k8s-cf76d\" (UID: \"a786d6e0-64e4-4bfb-a93a-673b9d775053\") " pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.834461 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.844398 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.854565 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.854656 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec5e2185-a03f-459b-95ce-cf8a04c9742d-metrics-certs\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.854696 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e2185-a03f-459b-95ce-cf8a04c9742d-cert\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.854753 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87bkn\" (UniqueName: \"kubernetes.io/projected/4a978343-2c48-4153-a20e-631bbe3c1595-kube-api-access-87bkn\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.854781 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sjzl\" (UniqueName: \"kubernetes.io/projected/ec5e2185-a03f-459b-95ce-cf8a04c9742d-kube-api-access-5sjzl\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 
13:13:25.854833 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a978343-2c48-4153-a20e-631bbe3c1595-metallb-excludel2\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: E1007 13:13:25.854834 4959 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.854878 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-metrics-certs\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: E1007 13:13:25.854922 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist podName:4a978343-2c48-4153-a20e-631bbe3c1595 nodeName:}" failed. No retries permitted until 2025-10-07 13:13:26.354900141 +0000 UTC m=+758.515622818 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist") pod "speaker-dgrrl" (UID: "4a978343-2c48-4153-a20e-631bbe3c1595") : secret "metallb-memberlist" not found Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.856184 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a978343-2c48-4153-a20e-631bbe3c1595-metallb-excludel2\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.858887 4959 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.859530 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec5e2185-a03f-459b-95ce-cf8a04c9742d-metrics-certs\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.860791 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-metrics-certs\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.872048 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e2185-a03f-459b-95ce-cf8a04c9742d-cert\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.877369 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-87bkn\" (UniqueName: \"kubernetes.io/projected/4a978343-2c48-4153-a20e-631bbe3c1595-kube-api-access-87bkn\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.884560 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sjzl\" (UniqueName: \"kubernetes.io/projected/ec5e2185-a03f-459b-95ce-cf8a04c9742d-kube-api-access-5sjzl\") pod \"controller-68d546b9d8-7b7r7\" (UID: \"ec5e2185-a03f-459b-95ce-cf8a04c9742d\") " pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:25 crc kubenswrapper[4959]: I1007 13:13:25.949405 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.278994 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx"] Oct 07 13:13:26 crc kubenswrapper[4959]: W1007 13:13:26.282772 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea4413f6_7433_4301_856f_51073cbf20b0.slice/crio-f0d886b8271d4442008c52d1ba7b65eeea7413eec9b65307e9e5dfa35b74600b WatchSource:0}: Error finding container f0d886b8271d4442008c52d1ba7b65eeea7413eec9b65307e9e5dfa35b74600b: Status 404 returned error can't find the container with id f0d886b8271d4442008c52d1ba7b65eeea7413eec9b65307e9e5dfa35b74600b Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.363329 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:26 crc kubenswrapper[4959]: E1007 13:13:26.363524 4959 secret.go:188] Couldn't get 
secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 13:13:26 crc kubenswrapper[4959]: E1007 13:13:26.363581 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist podName:4a978343-2c48-4153-a20e-631bbe3c1595 nodeName:}" failed. No retries permitted until 2025-10-07 13:13:27.363563953 +0000 UTC m=+759.524286640 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist") pod "speaker-dgrrl" (UID: "4a978343-2c48-4153-a20e-631bbe3c1595") : secret "metallb-memberlist" not found Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.424102 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7b7r7"] Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.816324 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.816762 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"7f3c4375bafe5aa38395f19c9586cbc9cf2c76a452ccde570f314e1aee7223bc"} Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.816782 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7b7r7" event={"ID":"ec5e2185-a03f-459b-95ce-cf8a04c9742d","Type":"ContainerStarted","Data":"aa80d231793038eb0820abaab9fc2f8e0fbd1cc2c7417dbbd8da3cba758e6c88"} Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.816796 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7b7r7" event={"ID":"ec5e2185-a03f-459b-95ce-cf8a04c9742d","Type":"ContainerStarted","Data":"ec258ed099d3ef575b2bf66fd44e16ad6b215b7e88474c230fa4df6ce1dd1b24"} Oct 07 
13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.816813 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7b7r7" event={"ID":"ec5e2185-a03f-459b-95ce-cf8a04c9742d","Type":"ContainerStarted","Data":"b70ac2f329d500c2be4857757f47bbe1d0510f4b6370552a5344567d311b0c66"} Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.816828 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" event={"ID":"ea4413f6-7433-4301-856f-51073cbf20b0","Type":"ContainerStarted","Data":"f0d886b8271d4442008c52d1ba7b65eeea7413eec9b65307e9e5dfa35b74600b"} Oct 07 13:13:26 crc kubenswrapper[4959]: I1007 13:13:26.838344 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-7b7r7" podStartSLOduration=1.838311772 podStartE2EDuration="1.838311772s" podCreationTimestamp="2025-10-07 13:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:13:26.827224567 +0000 UTC m=+758.987947324" watchObservedRunningTime="2025-10-07 13:13:26.838311772 +0000 UTC m=+758.999034489" Oct 07 13:13:27 crc kubenswrapper[4959]: I1007 13:13:27.380875 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:27 crc kubenswrapper[4959]: I1007 13:13:27.387426 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a978343-2c48-4153-a20e-631bbe3c1595-memberlist\") pod \"speaker-dgrrl\" (UID: \"4a978343-2c48-4153-a20e-631bbe3c1595\") " pod="metallb-system/speaker-dgrrl" Oct 07 13:13:27 crc kubenswrapper[4959]: I1007 13:13:27.441669 4959 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dgrrl" Oct 07 13:13:27 crc kubenswrapper[4959]: I1007 13:13:27.822773 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dgrrl" event={"ID":"4a978343-2c48-4153-a20e-631bbe3c1595","Type":"ContainerStarted","Data":"55b20095d67b76f3d13bab4c9cbf7a1076e7d89b210ad16f2ebb996d53e13265"} Oct 07 13:13:27 crc kubenswrapper[4959]: I1007 13:13:27.823108 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dgrrl" event={"ID":"4a978343-2c48-4153-a20e-631bbe3c1595","Type":"ContainerStarted","Data":"d2e50fd59a9f6d90aa3b20a59f0902372d24d552322cec9f073153f5d84f3f97"} Oct 07 13:13:28 crc kubenswrapper[4959]: I1007 13:13:28.834926 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dgrrl" event={"ID":"4a978343-2c48-4153-a20e-631bbe3c1595","Type":"ContainerStarted","Data":"cd8ee7a7c64fc0b13d3051efd6d6c652e208dcaf9665088dab98f4919076d760"} Oct 07 13:13:28 crc kubenswrapper[4959]: I1007 13:13:28.835519 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dgrrl" Oct 07 13:13:31 crc kubenswrapper[4959]: I1007 13:13:31.527144 4959 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:13:33 crc kubenswrapper[4959]: I1007 13:13:33.881008 4959 generic.go:334] "Generic (PLEG): container finished" podID="a786d6e0-64e4-4bfb-a93a-673b9d775053" containerID="f3abbeeb47f1de2633f3662fd6403c15696a61476bd5c11da1f6b89328fa910b" exitCode=0 Oct 07 13:13:33 crc kubenswrapper[4959]: I1007 13:13:33.881111 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerDied","Data":"f3abbeeb47f1de2633f3662fd6403c15696a61476bd5c11da1f6b89328fa910b"} Oct 07 13:13:33 crc kubenswrapper[4959]: I1007 13:13:33.882822 4959 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" event={"ID":"ea4413f6-7433-4301-856f-51073cbf20b0","Type":"ContainerStarted","Data":"8f535909ac1641fb37078980273405f7d9b6d36b468235d736f745b176a4e537"} Oct 07 13:13:33 crc kubenswrapper[4959]: I1007 13:13:33.883069 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:33 crc kubenswrapper[4959]: I1007 13:13:33.918946 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dgrrl" podStartSLOduration=8.918929952 podStartE2EDuration="8.918929952s" podCreationTimestamp="2025-10-07 13:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:13:28.857300044 +0000 UTC m=+761.018022721" watchObservedRunningTime="2025-10-07 13:13:33.918929952 +0000 UTC m=+766.079652629" Oct 07 13:13:34 crc kubenswrapper[4959]: I1007 13:13:34.891902 4959 generic.go:334] "Generic (PLEG): container finished" podID="a786d6e0-64e4-4bfb-a93a-673b9d775053" containerID="2bfb1df8281c592db4c49e23b6f61ad91e6dcda3667a317a0604c191a2d67f8e" exitCode=0 Oct 07 13:13:34 crc kubenswrapper[4959]: I1007 13:13:34.892007 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerDied","Data":"2bfb1df8281c592db4c49e23b6f61ad91e6dcda3667a317a0604c191a2d67f8e"} Oct 07 13:13:34 crc kubenswrapper[4959]: I1007 13:13:34.925987 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" podStartSLOduration=3.036147083 podStartE2EDuration="9.925969583s" podCreationTimestamp="2025-10-07 13:13:25 +0000 UTC" firstStartedPulling="2025-10-07 13:13:26.284382374 +0000 UTC m=+758.445105041" lastFinishedPulling="2025-10-07 
13:13:33.174204864 +0000 UTC m=+765.334927541" observedRunningTime="2025-10-07 13:13:33.938879979 +0000 UTC m=+766.099602666" watchObservedRunningTime="2025-10-07 13:13:34.925969583 +0000 UTC m=+767.086692270" Oct 07 13:13:35 crc kubenswrapper[4959]: I1007 13:13:35.899469 4959 generic.go:334] "Generic (PLEG): container finished" podID="a786d6e0-64e4-4bfb-a93a-673b9d775053" containerID="58c6e936f35b6684bd469a721cd7a879905dbd39b7f65a798415280aaf1bcf70" exitCode=0 Oct 07 13:13:35 crc kubenswrapper[4959]: I1007 13:13:35.899656 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerDied","Data":"58c6e936f35b6684bd469a721cd7a879905dbd39b7f65a798415280aaf1bcf70"} Oct 07 13:13:36 crc kubenswrapper[4959]: I1007 13:13:36.914055 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"f6bf328daa3b8fc934faded756bbb6d07308e600dd2bc57b28f5cf0d54df3835"} Oct 07 13:13:36 crc kubenswrapper[4959]: I1007 13:13:36.914443 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"f29b85d9ec8817f6cffa395f840963d3718e285bfc05a68fe738a9381b38ec47"} Oct 07 13:13:36 crc kubenswrapper[4959]: I1007 13:13:36.914458 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"60cdc7315a059ae80fd6bbe0f777bc46b8e884d3db4e35c5a5e0c9db91d861f3"} Oct 07 13:13:36 crc kubenswrapper[4959]: I1007 13:13:36.914470 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"6b55776f3b29b9d4ac261ad9f9829f83eb178b1e47a2899e1cafa8671acba43c"} Oct 07 
13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.444902 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dgrrl" Oct 07 13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.695555 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.695607 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.923516 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"8208a0b1d3de2355eed771ade467027871be9a6baba09af6c37223fd55a8c43f"} Oct 07 13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.923562 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cf76d" event={"ID":"a786d6e0-64e4-4bfb-a93a-673b9d775053","Type":"ContainerStarted","Data":"10b080e107f49fcad40e5243bd64811c3ecd22567b60e1578f418d1b8599cb17"} Oct 07 13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.923857 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:37 crc kubenswrapper[4959]: I1007 13:13:37.950225 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cf76d" podStartSLOduration=5.769656786 podStartE2EDuration="12.950206926s" podCreationTimestamp="2025-10-07 13:13:25 +0000 UTC" 
firstStartedPulling="2025-10-07 13:13:25.987966262 +0000 UTC m=+758.148688949" lastFinishedPulling="2025-10-07 13:13:33.168516422 +0000 UTC m=+765.329239089" observedRunningTime="2025-10-07 13:13:37.9485908 +0000 UTC m=+770.109313507" watchObservedRunningTime="2025-10-07 13:13:37.950206926 +0000 UTC m=+770.110929603" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.362356 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v5pzm"] Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.363266 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.368543 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.369345 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.373588 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v5pzm"] Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.460316 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbcml\" (UniqueName: \"kubernetes.io/projected/386a1602-9230-4c90-ac17-8c518e5a9825-kube-api-access-cbcml\") pod \"openstack-operator-index-v5pzm\" (UID: \"386a1602-9230-4c90-ac17-8c518e5a9825\") " pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.561384 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbcml\" (UniqueName: \"kubernetes.io/projected/386a1602-9230-4c90-ac17-8c518e5a9825-kube-api-access-cbcml\") pod \"openstack-operator-index-v5pzm\" (UID: 
\"386a1602-9230-4c90-ac17-8c518e5a9825\") " pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.579093 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbcml\" (UniqueName: \"kubernetes.io/projected/386a1602-9230-4c90-ac17-8c518e5a9825-kube-api-access-cbcml\") pod \"openstack-operator-index-v5pzm\" (UID: \"386a1602-9230-4c90-ac17-8c518e5a9825\") " pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.692287 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.846209 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:40 crc kubenswrapper[4959]: I1007 13:13:40.893914 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:41 crc kubenswrapper[4959]: I1007 13:13:41.088619 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v5pzm"] Oct 07 13:13:41 crc kubenswrapper[4959]: W1007 13:13:41.101343 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386a1602_9230_4c90_ac17_8c518e5a9825.slice/crio-f21a8ca75e399d1541043a482a9a4dbacee3ef1c80cec5aeb31ec2d5d8fd7a44 WatchSource:0}: Error finding container f21a8ca75e399d1541043a482a9a4dbacee3ef1c80cec5aeb31ec2d5d8fd7a44: Status 404 returned error can't find the container with id f21a8ca75e399d1541043a482a9a4dbacee3ef1c80cec5aeb31ec2d5d8fd7a44 Oct 07 13:13:41 crc kubenswrapper[4959]: I1007 13:13:41.962954 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v5pzm" 
event={"ID":"386a1602-9230-4c90-ac17-8c518e5a9825","Type":"ContainerStarted","Data":"f21a8ca75e399d1541043a482a9a4dbacee3ef1c80cec5aeb31ec2d5d8fd7a44"} Oct 07 13:13:42 crc kubenswrapper[4959]: I1007 13:13:42.972007 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v5pzm" event={"ID":"386a1602-9230-4c90-ac17-8c518e5a9825","Type":"ContainerStarted","Data":"f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5"} Oct 07 13:13:42 crc kubenswrapper[4959]: I1007 13:13:42.997315 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v5pzm" podStartSLOduration=2.067411804 podStartE2EDuration="2.997283122s" podCreationTimestamp="2025-10-07 13:13:40 +0000 UTC" firstStartedPulling="2025-10-07 13:13:41.104582068 +0000 UTC m=+773.265304745" lastFinishedPulling="2025-10-07 13:13:42.034453396 +0000 UTC m=+774.195176063" observedRunningTime="2025-10-07 13:13:42.989298925 +0000 UTC m=+775.150021612" watchObservedRunningTime="2025-10-07 13:13:42.997283122 +0000 UTC m=+775.158005839" Oct 07 13:13:43 crc kubenswrapper[4959]: I1007 13:13:43.744680 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v5pzm"] Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.349177 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vptcw"] Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.350126 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.352345 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wsszr" Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.360637 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vptcw"] Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.538640 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5wcj\" (UniqueName: \"kubernetes.io/projected/8a32157a-8fdd-4430-9d22-3401166e4352-kube-api-access-w5wcj\") pod \"openstack-operator-index-vptcw\" (UID: \"8a32157a-8fdd-4430-9d22-3401166e4352\") " pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.640861 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5wcj\" (UniqueName: \"kubernetes.io/projected/8a32157a-8fdd-4430-9d22-3401166e4352-kube-api-access-w5wcj\") pod \"openstack-operator-index-vptcw\" (UID: \"8a32157a-8fdd-4430-9d22-3401166e4352\") " pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.671823 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5wcj\" (UniqueName: \"kubernetes.io/projected/8a32157a-8fdd-4430-9d22-3401166e4352-kube-api-access-w5wcj\") pod \"openstack-operator-index-vptcw\" (UID: \"8a32157a-8fdd-4430-9d22-3401166e4352\") " pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.684888 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:44 crc kubenswrapper[4959]: I1007 13:13:44.985199 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-v5pzm" podUID="386a1602-9230-4c90-ac17-8c518e5a9825" containerName="registry-server" containerID="cri-o://f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5" gracePeriod=2 Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.085027 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vptcw"] Oct 07 13:13:45 crc kubenswrapper[4959]: W1007 13:13:45.090440 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a32157a_8fdd_4430_9d22_3401166e4352.slice/crio-d9bf0b96696943d4a651ac8ad8e3017368b48468d63ca25bd2e02d55f1c11e76 WatchSource:0}: Error finding container d9bf0b96696943d4a651ac8ad8e3017368b48468d63ca25bd2e02d55f1c11e76: Status 404 returned error can't find the container with id d9bf0b96696943d4a651ac8ad8e3017368b48468d63ca25bd2e02d55f1c11e76 Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.466934 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.653013 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbcml\" (UniqueName: \"kubernetes.io/projected/386a1602-9230-4c90-ac17-8c518e5a9825-kube-api-access-cbcml\") pod \"386a1602-9230-4c90-ac17-8c518e5a9825\" (UID: \"386a1602-9230-4c90-ac17-8c518e5a9825\") " Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.658020 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386a1602-9230-4c90-ac17-8c518e5a9825-kube-api-access-cbcml" (OuterVolumeSpecName: "kube-api-access-cbcml") pod "386a1602-9230-4c90-ac17-8c518e5a9825" (UID: "386a1602-9230-4c90-ac17-8c518e5a9825"). InnerVolumeSpecName "kube-api-access-cbcml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.754265 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbcml\" (UniqueName: \"kubernetes.io/projected/386a1602-9230-4c90-ac17-8c518e5a9825-kube-api-access-cbcml\") on node \"crc\" DevicePath \"\"" Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.840064 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-thbcx" Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.956749 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-7b7r7" Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.998827 4959 generic.go:334] "Generic (PLEG): container finished" podID="386a1602-9230-4c90-ac17-8c518e5a9825" containerID="f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5" exitCode=0 Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.998892 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-v5pzm" event={"ID":"386a1602-9230-4c90-ac17-8c518e5a9825","Type":"ContainerDied","Data":"f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5"} Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.998914 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v5pzm" Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.998958 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v5pzm" event={"ID":"386a1602-9230-4c90-ac17-8c518e5a9825","Type":"ContainerDied","Data":"f21a8ca75e399d1541043a482a9a4dbacee3ef1c80cec5aeb31ec2d5d8fd7a44"} Oct 07 13:13:45 crc kubenswrapper[4959]: I1007 13:13:45.998984 4959 scope.go:117] "RemoveContainer" containerID="f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5" Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.001657 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vptcw" event={"ID":"8a32157a-8fdd-4430-9d22-3401166e4352","Type":"ContainerStarted","Data":"19496ec7b8b8fa32c43c273297c43406076f84fe129a6222b01b1e1ba1e09032"} Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.001701 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vptcw" event={"ID":"8a32157a-8fdd-4430-9d22-3401166e4352","Type":"ContainerStarted","Data":"d9bf0b96696943d4a651ac8ad8e3017368b48468d63ca25bd2e02d55f1c11e76"} Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.021549 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vptcw" podStartSLOduration=1.582158396 podStartE2EDuration="2.021529039s" podCreationTimestamp="2025-10-07 13:13:44 +0000 UTC" firstStartedPulling="2025-10-07 13:13:45.093973175 +0000 UTC m=+777.254695852" lastFinishedPulling="2025-10-07 
13:13:45.533343818 +0000 UTC m=+777.694066495" observedRunningTime="2025-10-07 13:13:46.016985848 +0000 UTC m=+778.177708585" watchObservedRunningTime="2025-10-07 13:13:46.021529039 +0000 UTC m=+778.182251726" Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.026421 4959 scope.go:117] "RemoveContainer" containerID="f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5" Oct 07 13:13:46 crc kubenswrapper[4959]: E1007 13:13:46.026983 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5\": container with ID starting with f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5 not found: ID does not exist" containerID="f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5" Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.027065 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5"} err="failed to get container status \"f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5\": rpc error: code = NotFound desc = could not find container \"f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5\": container with ID starting with f7ad163cdf1fdbc3971449cf80d45678db0ea56979ac17124a0cdcebfbdb42e5 not found: ID does not exist" Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.047149 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v5pzm"] Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.051287 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-v5pzm"] Oct 07 13:13:46 crc kubenswrapper[4959]: I1007 13:13:46.815993 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386a1602-9230-4c90-ac17-8c518e5a9825" 
path="/var/lib/kubelet/pods/386a1602-9230-4c90-ac17-8c518e5a9825/volumes" Oct 07 13:13:54 crc kubenswrapper[4959]: I1007 13:13:54.685316 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:54 crc kubenswrapper[4959]: I1007 13:13:54.686111 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:54 crc kubenswrapper[4959]: I1007 13:13:54.719012 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.086794 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-vptcw" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.849168 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cf76d" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.973787 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr"] Oct 07 13:13:55 crc kubenswrapper[4959]: E1007 13:13:55.974085 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a1602-9230-4c90-ac17-8c518e5a9825" containerName="registry-server" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.974099 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a1602-9230-4c90-ac17-8c518e5a9825" containerName="registry-server" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.974227 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="386a1602-9230-4c90-ac17-8c518e5a9825" containerName="registry-server" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.975138 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.976917 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4zlnv" Oct 07 13:13:55 crc kubenswrapper[4959]: I1007 13:13:55.984087 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr"] Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.088782 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-bundle\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.088834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bv4\" (UniqueName: \"kubernetes.io/projected/1d76c837-d256-4ea9-a23f-e55ee516e726-kube-api-access-d8bv4\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.088855 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-util\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 
13:13:56.190268 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bv4\" (UniqueName: \"kubernetes.io/projected/1d76c837-d256-4ea9-a23f-e55ee516e726-kube-api-access-d8bv4\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.190307 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-util\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.190376 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-bundle\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.190806 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-bundle\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.191054 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-util\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.217251 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bv4\" (UniqueName: \"kubernetes.io/projected/1d76c837-d256-4ea9-a23f-e55ee516e726-kube-api-access-d8bv4\") pod \"563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.292380 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:13:56 crc kubenswrapper[4959]: I1007 13:13:56.731310 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr"] Oct 07 13:13:57 crc kubenswrapper[4959]: I1007 13:13:57.075996 4959 generic.go:334] "Generic (PLEG): container finished" podID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerID="79ca58a6a05111464531ee3237687e897ff8ed3c298b52a1744763768ac070ab" exitCode=0 Oct 07 13:13:57 crc kubenswrapper[4959]: I1007 13:13:57.076060 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" event={"ID":"1d76c837-d256-4ea9-a23f-e55ee516e726","Type":"ContainerDied","Data":"79ca58a6a05111464531ee3237687e897ff8ed3c298b52a1744763768ac070ab"} Oct 07 13:13:57 crc kubenswrapper[4959]: I1007 13:13:57.076092 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" event={"ID":"1d76c837-d256-4ea9-a23f-e55ee516e726","Type":"ContainerStarted","Data":"e6d367e325418caaa925d4475f8c5a8198dfdb18a4e753159dd77c3c0ae907c8"} Oct 07 13:13:59 crc kubenswrapper[4959]: I1007 13:13:59.092432 4959 generic.go:334] "Generic (PLEG): container finished" podID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerID="5a299651a9930a44b01cbf41521de27dcb8ee7a4780a11563badfe1abc5d744a" exitCode=0 Oct 07 13:13:59 crc kubenswrapper[4959]: I1007 13:13:59.092696 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" event={"ID":"1d76c837-d256-4ea9-a23f-e55ee516e726","Type":"ContainerDied","Data":"5a299651a9930a44b01cbf41521de27dcb8ee7a4780a11563badfe1abc5d744a"} Oct 07 13:14:00 crc kubenswrapper[4959]: I1007 13:14:00.106585 4959 generic.go:334] "Generic (PLEG): container finished" podID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerID="9eee8d278852a2675034ad1cb6e80b08dd7d88c7feec2e3ece68fdb6b243b0dc" exitCode=0 Oct 07 13:14:00 crc kubenswrapper[4959]: I1007 13:14:00.106724 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" event={"ID":"1d76c837-d256-4ea9-a23f-e55ee516e726","Type":"ContainerDied","Data":"9eee8d278852a2675034ad1cb6e80b08dd7d88c7feec2e3ece68fdb6b243b0dc"} Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.463172 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.587593 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-util\") pod \"1d76c837-d256-4ea9-a23f-e55ee516e726\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.588017 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8bv4\" (UniqueName: \"kubernetes.io/projected/1d76c837-d256-4ea9-a23f-e55ee516e726-kube-api-access-d8bv4\") pod \"1d76c837-d256-4ea9-a23f-e55ee516e726\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.589275 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-bundle\") pod \"1d76c837-d256-4ea9-a23f-e55ee516e726\" (UID: \"1d76c837-d256-4ea9-a23f-e55ee516e726\") " Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.591154 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-bundle" (OuterVolumeSpecName: "bundle") pod "1d76c837-d256-4ea9-a23f-e55ee516e726" (UID: "1d76c837-d256-4ea9-a23f-e55ee516e726"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.594337 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d76c837-d256-4ea9-a23f-e55ee516e726-kube-api-access-d8bv4" (OuterVolumeSpecName: "kube-api-access-d8bv4") pod "1d76c837-d256-4ea9-a23f-e55ee516e726" (UID: "1d76c837-d256-4ea9-a23f-e55ee516e726"). InnerVolumeSpecName "kube-api-access-d8bv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.600798 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-util" (OuterVolumeSpecName: "util") pod "1d76c837-d256-4ea9-a23f-e55ee516e726" (UID: "1d76c837-d256-4ea9-a23f-e55ee516e726"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.691858 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8bv4\" (UniqueName: \"kubernetes.io/projected/1d76c837-d256-4ea9-a23f-e55ee516e726-kube-api-access-d8bv4\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.691895 4959 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:01 crc kubenswrapper[4959]: I1007 13:14:01.691904 4959 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d76c837-d256-4ea9-a23f-e55ee516e726-util\") on node \"crc\" DevicePath \"\"" Oct 07 13:14:02 crc kubenswrapper[4959]: I1007 13:14:02.124857 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" event={"ID":"1d76c837-d256-4ea9-a23f-e55ee516e726","Type":"ContainerDied","Data":"e6d367e325418caaa925d4475f8c5a8198dfdb18a4e753159dd77c3c0ae907c8"} Oct 07 13:14:02 crc kubenswrapper[4959]: I1007 13:14:02.124901 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d367e325418caaa925d4475f8c5a8198dfdb18a4e753159dd77c3c0ae907c8" Oct 07 13:14:02 crc kubenswrapper[4959]: I1007 13:14:02.125283 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr" Oct 07 13:14:07 crc kubenswrapper[4959]: I1007 13:14:07.695660 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:14:07 crc kubenswrapper[4959]: I1007 13:14:07.696124 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:14:07 crc kubenswrapper[4959]: I1007 13:14:07.696166 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:14:07 crc kubenswrapper[4959]: I1007 13:14:07.696699 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef53d2923cca70810fb795e5b43b9166268df0aa973b6ab2fa0bc61b8de8c8ee"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:14:07 crc kubenswrapper[4959]: I1007 13:14:07.696743 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://ef53d2923cca70810fb795e5b43b9166268df0aa973b6ab2fa0bc61b8de8c8ee" gracePeriod=600 Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.173493 4959 generic.go:334] "Generic (PLEG): 
container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="ef53d2923cca70810fb795e5b43b9166268df0aa973b6ab2fa0bc61b8de8c8ee" exitCode=0 Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.173553 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"ef53d2923cca70810fb795e5b43b9166268df0aa973b6ab2fa0bc61b8de8c8ee"} Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.173859 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"bf0d8a96d5046ea44da887dd65609025728fc1479fe4b34e19d62ea3b31f2ff1"} Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.173881 4959 scope.go:117] "RemoveContainer" containerID="083fdb2a36bd004d963c9bad52ff246ecb91d3e06944051b30461a673d36f5e0" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.670682 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"] Oct 07 13:14:08 crc kubenswrapper[4959]: E1007 13:14:08.671130 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="pull" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.671142 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="pull" Oct 07 13:14:08 crc kubenswrapper[4959]: E1007 13:14:08.671152 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="util" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.671158 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="util" Oct 07 13:14:08 crc kubenswrapper[4959]: 
E1007 13:14:08.671172 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="extract" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.671177 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="extract" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.671287 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d76c837-d256-4ea9-a23f-e55ee516e726" containerName="extract" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.671872 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.678913 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-4ch8l" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.748922 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"] Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.791551 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5fvp\" (UniqueName: \"kubernetes.io/projected/2273db8c-b41b-453c-a22d-5fbb57fd2178-kube-api-access-r5fvp\") pod \"openstack-operator-controller-operator-57bc4467bb-mb7dq\" (UID: \"2273db8c-b41b-453c-a22d-5fbb57fd2178\") " pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.893144 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5fvp\" (UniqueName: \"kubernetes.io/projected/2273db8c-b41b-453c-a22d-5fbb57fd2178-kube-api-access-r5fvp\") pod \"openstack-operator-controller-operator-57bc4467bb-mb7dq\" (UID: 
\"2273db8c-b41b-453c-a22d-5fbb57fd2178\") " pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.914777 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5fvp\" (UniqueName: \"kubernetes.io/projected/2273db8c-b41b-453c-a22d-5fbb57fd2178-kube-api-access-r5fvp\") pod \"openstack-operator-controller-operator-57bc4467bb-mb7dq\" (UID: \"2273db8c-b41b-453c-a22d-5fbb57fd2178\") " pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:08 crc kubenswrapper[4959]: I1007 13:14:08.990296 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:09 crc kubenswrapper[4959]: I1007 13:14:09.476058 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"] Oct 07 13:14:09 crc kubenswrapper[4959]: W1007 13:14:09.484810 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2273db8c_b41b_453c_a22d_5fbb57fd2178.slice/crio-436b9c634db4548066c50d42e9d7d38b42caabe0dd49c03763d31f5f19b726ac WatchSource:0}: Error finding container 436b9c634db4548066c50d42e9d7d38b42caabe0dd49c03763d31f5f19b726ac: Status 404 returned error can't find the container with id 436b9c634db4548066c50d42e9d7d38b42caabe0dd49c03763d31f5f19b726ac Oct 07 13:14:10 crc kubenswrapper[4959]: I1007 13:14:10.203764 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" event={"ID":"2273db8c-b41b-453c-a22d-5fbb57fd2178","Type":"ContainerStarted","Data":"436b9c634db4548066c50d42e9d7d38b42caabe0dd49c03763d31f5f19b726ac"} Oct 07 13:14:14 crc kubenswrapper[4959]: I1007 13:14:14.230606 4959 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" event={"ID":"2273db8c-b41b-453c-a22d-5fbb57fd2178","Type":"ContainerStarted","Data":"e1923992d91364cb7b808758e020bed32bb1549bfd5c69e361b20a3941d70152"} Oct 07 13:14:16 crc kubenswrapper[4959]: I1007 13:14:16.247797 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" event={"ID":"2273db8c-b41b-453c-a22d-5fbb57fd2178","Type":"ContainerStarted","Data":"9bddedf47315e60b3586b3fc3e19d26c00edca23523d4c8be1280f3967cf4ffd"} Oct 07 13:14:16 crc kubenswrapper[4959]: I1007 13:14:16.249411 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:16 crc kubenswrapper[4959]: I1007 13:14:16.275268 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" podStartSLOduration=1.849224622 podStartE2EDuration="8.275251596s" podCreationTimestamp="2025-10-07 13:14:08 +0000 UTC" firstStartedPulling="2025-10-07 13:14:09.486346553 +0000 UTC m=+801.647069230" lastFinishedPulling="2025-10-07 13:14:15.912373527 +0000 UTC m=+808.073096204" observedRunningTime="2025-10-07 13:14:16.271771826 +0000 UTC m=+808.432494513" watchObservedRunningTime="2025-10-07 13:14:16.275251596 +0000 UTC m=+808.435974273" Oct 07 13:14:18 crc kubenswrapper[4959]: I1007 13:14:18.993039 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.586144 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vgd79"] Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.588069 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.603061 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgd79"] Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.757871 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-utilities\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.757950 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbpx\" (UniqueName: \"kubernetes.io/projected/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-kube-api-access-5cbpx\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.757992 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-catalog-content\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.858688 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-utilities\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.858735 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5cbpx\" (UniqueName: \"kubernetes.io/projected/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-kube-api-access-5cbpx\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.858762 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-catalog-content\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.859580 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-catalog-content\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.859730 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-utilities\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.880776 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbpx\" (UniqueName: \"kubernetes.io/projected/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-kube-api-access-5cbpx\") pod \"certified-operators-vgd79\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") " pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:29 crc kubenswrapper[4959]: I1007 13:14:29.919981 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgd79" Oct 07 13:14:30 crc kubenswrapper[4959]: I1007 13:14:30.368302 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgd79"] Oct 07 13:14:31 crc kubenswrapper[4959]: I1007 13:14:31.329942 4959 generic.go:334] "Generic (PLEG): container finished" podID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerID="27e07a6777033a2945a8f429311cf233b757be5a9a62fab38bc77ea6e3558fee" exitCode=0 Oct 07 13:14:31 crc kubenswrapper[4959]: I1007 13:14:31.330047 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgd79" event={"ID":"a179b1bb-17fb-4ba8-80f9-0741c9b49c04","Type":"ContainerDied","Data":"27e07a6777033a2945a8f429311cf233b757be5a9a62fab38bc77ea6e3558fee"} Oct 07 13:14:31 crc kubenswrapper[4959]: I1007 13:14:31.330176 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgd79" event={"ID":"a179b1bb-17fb-4ba8-80f9-0741c9b49c04","Type":"ContainerStarted","Data":"e797f489ca9aa8056041d492da526ee3d781ab57ca3428adb4ac5cf20422c824"} Oct 07 13:14:32 crc kubenswrapper[4959]: I1007 13:14:32.337001 4959 generic.go:334] "Generic (PLEG): container finished" podID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerID="72c8031d2f7d38267bb15a9529b702fcd570b6518eb660935a4b15530fc9e1b5" exitCode=0 Oct 07 13:14:32 crc kubenswrapper[4959]: I1007 13:14:32.337041 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgd79" event={"ID":"a179b1bb-17fb-4ba8-80f9-0741c9b49c04","Type":"ContainerDied","Data":"72c8031d2f7d38267bb15a9529b702fcd570b6518eb660935a4b15530fc9e1b5"} Oct 07 13:14:33 crc kubenswrapper[4959]: I1007 13:14:33.346761 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgd79" 
event={"ID":"a179b1bb-17fb-4ba8-80f9-0741c9b49c04","Type":"ContainerStarted","Data":"49b99173b6709c64e110ae83c5c64570e4187f648dd49c66d0783929e6580bb5"} Oct 07 13:14:33 crc kubenswrapper[4959]: I1007 13:14:33.367943 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vgd79" podStartSLOduration=2.8729478410000002 podStartE2EDuration="4.367900039s" podCreationTimestamp="2025-10-07 13:14:29 +0000 UTC" firstStartedPulling="2025-10-07 13:14:31.331871136 +0000 UTC m=+823.492593823" lastFinishedPulling="2025-10-07 13:14:32.826823344 +0000 UTC m=+824.987546021" observedRunningTime="2025-10-07 13:14:33.364526332 +0000 UTC m=+825.525249009" watchObservedRunningTime="2025-10-07 13:14:33.367900039 +0000 UTC m=+825.528622716" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.662985 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.664412 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.666922 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8hspd" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.674996 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.676156 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.677853 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tglwx" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.686056 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.691164 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.699396 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.700474 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.701757 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hhgwz" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.717563 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.733340 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.734271 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.737277 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-cfppq" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.745049 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.749689 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.750662 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.753391 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8b4kw" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.767582 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.768543 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.771054 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-984zv" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.790285 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.804353 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.805596 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.810721 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.813214 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.813341 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j7h4m" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.820335 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.821673 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.824646 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-l6bct" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.826722 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.841685 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.842645 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.847437 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.848585 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dcv66" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.851786 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2cqh\" (UniqueName: \"kubernetes.io/projected/3e59fb97-6ef4-42a5-a264-506bdccd8a23-kube-api-access-t2cqh\") pod \"cinder-operator-controller-manager-84bd8f6848-zl4v9\" (UID: \"3e59fb97-6ef4-42a5-a264-506bdccd8a23\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.851949 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrtj\" 
(UniqueName: \"kubernetes.io/projected/4294ed44-d412-4366-959e-cb534ab792bc-kube-api-access-hjrtj\") pod \"heat-operator-controller-manager-7ccfc8cf49-d4g6g\" (UID: \"4294ed44-d412-4366-959e-cb534ab792bc\") " pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.852304 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65jf\" (UniqueName: \"kubernetes.io/projected/035c3aeb-396b-47bf-a588-562bb0f27f88-kube-api-access-k65jf\") pod \"glance-operator-controller-manager-fd648f65-rmk5h\" (UID: \"035c3aeb-396b-47bf-a588-562bb0f27f88\") " pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.852705 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjk4t\" (UniqueName: \"kubernetes.io/projected/01b13867-f984-4d88-af12-28fc3ebc0b9f-kube-api-access-wjk4t\") pod \"designate-operator-controller-manager-58d86cd59d-xmpkg\" (UID: \"01b13867-f984-4d88-af12-28fc3ebc0b9f\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.852795 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntfkk\" (UniqueName: \"kubernetes.io/projected/c2a805f1-946a-4b48-9e52-4f24b56bd43a-kube-api-access-ntfkk\") pod \"barbican-operator-controller-manager-64f56ff694-b4rhk\" (UID: \"c2a805f1-946a-4b48-9e52-4f24b56bd43a\") " pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.854703 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.866781 4959 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.868035 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.871882 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.873019 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.873024 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4ll8l" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.878811 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.878990 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tk4jk" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.913737 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.927887 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-htbcn" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.942832 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977208 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2cqh\" (UniqueName: \"kubernetes.io/projected/3e59fb97-6ef4-42a5-a264-506bdccd8a23-kube-api-access-t2cqh\") pod \"cinder-operator-controller-manager-84bd8f6848-zl4v9\" (UID: \"3e59fb97-6ef4-42a5-a264-506bdccd8a23\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977560 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjrtj\" (UniqueName: \"kubernetes.io/projected/4294ed44-d412-4366-959e-cb534ab792bc-kube-api-access-hjrtj\") pod \"heat-operator-controller-manager-7ccfc8cf49-d4g6g\" (UID: \"4294ed44-d412-4366-959e-cb534ab792bc\") " pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977657 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/77bcfec2-4667-4415-af5e-3009e5ea4999-kube-api-access-t24zr\") pod \"keystone-operator-controller-manager-5b84cc7657-r57lc\" (UID: \"77bcfec2-4667-4415-af5e-3009e5ea4999\") " pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977755 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3213b12-9128-4d7c-8ec8-a731e6627de4-cert\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977825 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65jf\" (UniqueName: \"kubernetes.io/projected/035c3aeb-396b-47bf-a588-562bb0f27f88-kube-api-access-k65jf\") pod \"glance-operator-controller-manager-fd648f65-rmk5h\" (UID: \"035c3aeb-396b-47bf-a588-562bb0f27f88\") " pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977895 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjk4t\" (UniqueName: \"kubernetes.io/projected/01b13867-f984-4d88-af12-28fc3ebc0b9f-kube-api-access-wjk4t\") pod \"designate-operator-controller-manager-58d86cd59d-xmpkg\" (UID: \"01b13867-f984-4d88-af12-28fc3ebc0b9f\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.977970 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntfkk\" (UniqueName: \"kubernetes.io/projected/c2a805f1-946a-4b48-9e52-4f24b56bd43a-kube-api-access-ntfkk\") pod \"barbican-operator-controller-manager-64f56ff694-b4rhk\" (UID: \"c2a805f1-946a-4b48-9e52-4f24b56bd43a\") " pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.978040 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8lh6\" (UniqueName: 
\"kubernetes.io/projected/749f8ff6-9e1c-45ef-948f-1f8c255b670e-kube-api-access-j8lh6\") pod \"ironic-operator-controller-manager-5467f8988c-6t98f\" (UID: \"749f8ff6-9e1c-45ef-948f-1f8c255b670e\") " pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.978115 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7sf\" (UniqueName: \"kubernetes.io/projected/e3213b12-9128-4d7c-8ec8-a731e6627de4-kube-api-access-vl7sf\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.978192 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9fv7\" (UniqueName: \"kubernetes.io/projected/539702ff-226a-4c31-b715-af9af8ae1205-kube-api-access-s9fv7\") pod \"horizon-operator-controller-manager-5b477879bc-nf6mt\" (UID: \"539702ff-226a-4c31-b715-af9af8ae1205\") " pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.987272 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"] Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.988547 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" Oct 07 13:14:35 crc kubenswrapper[4959]: I1007 13:14:35.991413 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nwjv8" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.001701 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.027961 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2cqh\" (UniqueName: \"kubernetes.io/projected/3e59fb97-6ef4-42a5-a264-506bdccd8a23-kube-api-access-t2cqh\") pod \"cinder-operator-controller-manager-84bd8f6848-zl4v9\" (UID: \"3e59fb97-6ef4-42a5-a264-506bdccd8a23\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.031739 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrtj\" (UniqueName: \"kubernetes.io/projected/4294ed44-d412-4366-959e-cb534ab792bc-kube-api-access-hjrtj\") pod \"heat-operator-controller-manager-7ccfc8cf49-d4g6g\" (UID: \"4294ed44-d412-4366-959e-cb534ab792bc\") " pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.032180 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntfkk\" (UniqueName: \"kubernetes.io/projected/c2a805f1-946a-4b48-9e52-4f24b56bd43a-kube-api-access-ntfkk\") pod \"barbican-operator-controller-manager-64f56ff694-b4rhk\" (UID: \"c2a805f1-946a-4b48-9e52-4f24b56bd43a\") " pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.033918 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k65jf\" (UniqueName: \"kubernetes.io/projected/035c3aeb-396b-47bf-a588-562bb0f27f88-kube-api-access-k65jf\") pod \"glance-operator-controller-manager-fd648f65-rmk5h\" (UID: \"035c3aeb-396b-47bf-a588-562bb0f27f88\") " pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.037931 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.039534 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjk4t\" (UniqueName: \"kubernetes.io/projected/01b13867-f984-4d88-af12-28fc3ebc0b9f-kube-api-access-wjk4t\") pod \"designate-operator-controller-manager-58d86cd59d-xmpkg\" (UID: \"01b13867-f984-4d88-af12-28fc3ebc0b9f\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.039614 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.051254 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-b2gfm" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.051809 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.062238 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.063715 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.074019 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.076692 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.082921 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.083139 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f4p6p" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.083998 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/77bcfec2-4667-4415-af5e-3009e5ea4999-kube-api-access-t24zr\") pod \"keystone-operator-controller-manager-5b84cc7657-r57lc\" (UID: \"77bcfec2-4667-4415-af5e-3009e5ea4999\") " pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084034 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3213b12-9128-4d7c-8ec8-a731e6627de4-cert\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084084 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8lh6\" 
(UniqueName: \"kubernetes.io/projected/749f8ff6-9e1c-45ef-948f-1f8c255b670e-kube-api-access-j8lh6\") pod \"ironic-operator-controller-manager-5467f8988c-6t98f\" (UID: \"749f8ff6-9e1c-45ef-948f-1f8c255b670e\") " pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084118 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wwn\" (UniqueName: \"kubernetes.io/projected/5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85-kube-api-access-j4wwn\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-6kq7p\" (UID: \"5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084154 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7sf\" (UniqueName: \"kubernetes.io/projected/e3213b12-9128-4d7c-8ec8-a731e6627de4-kube-api-access-vl7sf\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084192 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqsk\" (UniqueName: \"kubernetes.io/projected/6e224af6-7095-4878-ba65-3a8e3f358968-kube-api-access-9rqsk\") pod \"manila-operator-controller-manager-7cb48dbc-26qz6\" (UID: \"6e224af6-7095-4878-ba65-3a8e3f358968\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084222 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9fv7\" (UniqueName: \"kubernetes.io/projected/539702ff-226a-4c31-b715-af9af8ae1205-kube-api-access-s9fv7\") pod 
\"horizon-operator-controller-manager-5b477879bc-nf6mt\" (UID: \"539702ff-226a-4c31-b715-af9af8ae1205\") " pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.084266 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnk2n\" (UniqueName: \"kubernetes.io/projected/eea6d6d3-ded0-4788-8901-34c02d659aee-kube-api-access-jnk2n\") pod \"neutron-operator-controller-manager-69b956fbf6-6tcd6\" (UID: \"eea6d6d3-ded0-4788-8901-34c02d659aee\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.085151 4959 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.085198 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3213b12-9128-4d7c-8ec8-a731e6627de4-cert podName:e3213b12-9128-4d7c-8ec8-a731e6627de4 nodeName:}" failed. No retries permitted until 2025-10-07 13:14:36.585181148 +0000 UTC m=+828.745903815 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3213b12-9128-4d7c-8ec8-a731e6627de4-cert") pod "infra-operator-controller-manager-84788b6bc5-d772s" (UID: "e3213b12-9128-4d7c-8ec8-a731e6627de4") : secret "infra-operator-webhook-server-cert" not found Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.113200 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.121503 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/77bcfec2-4667-4415-af5e-3009e5ea4999-kube-api-access-t24zr\") pod \"keystone-operator-controller-manager-5b84cc7657-r57lc\" (UID: \"77bcfec2-4667-4415-af5e-3009e5ea4999\") " pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.125828 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7sf\" (UniqueName: \"kubernetes.io/projected/e3213b12-9128-4d7c-8ec8-a731e6627de4-kube-api-access-vl7sf\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.138223 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.150252 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8lh6\" (UniqueName: \"kubernetes.io/projected/749f8ff6-9e1c-45ef-948f-1f8c255b670e-kube-api-access-j8lh6\") pod \"ironic-operator-controller-manager-5467f8988c-6t98f\" (UID: \"749f8ff6-9e1c-45ef-948f-1f8c255b670e\") " 
pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.150324 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.163134 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.164210 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xdxcf"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.165201 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxcf" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.165609 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.166097 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.167121 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vt4j8" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.172197 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9fv7\" (UniqueName: \"kubernetes.io/projected/539702ff-226a-4c31-b715-af9af8ae1205-kube-api-access-s9fv7\") pod \"horizon-operator-controller-manager-5b477879bc-nf6mt\" (UID: \"539702ff-226a-4c31-b715-af9af8ae1205\") " pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.177326 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"] Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.184899 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnk2n\" (UniqueName: \"kubernetes.io/projected/eea6d6d3-ded0-4788-8901-34c02d659aee-kube-api-access-jnk2n\") pod \"neutron-operator-controller-manager-69b956fbf6-6tcd6\" (UID: \"eea6d6d3-ded0-4788-8901-34c02d659aee\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.184974 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cj8\" (UniqueName: \"kubernetes.io/projected/171d0807-668d-4284-ab63-698401676fbe-kube-api-access-z5cj8\") pod \"octavia-operator-controller-manager-69f59f9d8-sfhzx\" (UID: \"171d0807-668d-4284-ab63-698401676fbe\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.185019 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j4wwn\" (UniqueName: \"kubernetes.io/projected/5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85-kube-api-access-j4wwn\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-6kq7p\" (UID: \"5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.185048 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7whtd\" (UniqueName: \"kubernetes.io/projected/64d46cbf-e1a4-4673-9f0e-01371175a1f9-kube-api-access-7whtd\") pod \"nova-operator-controller-manager-6c9b57c67-t2wjn\" (UID: \"64d46cbf-e1a4-4673-9f0e-01371175a1f9\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.185078 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.185096 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqsk\" (UniqueName: \"kubernetes.io/projected/6e224af6-7095-4878-ba65-3a8e3f358968-kube-api-access-9rqsk\") pod \"manila-operator-controller-manager-7cb48dbc-26qz6\" (UID: \"6e224af6-7095-4878-ba65-3a8e3f358968\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.185130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxvx\" (UniqueName: 
\"kubernetes.io/projected/0e606b13-be7c-4699-bb4b-5c50ddf32426-kube-api-access-sbxvx\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.187949 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.191462 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.198286 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdxcf"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.198420 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2w7cr"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.203355 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.205197 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.222175 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.237324 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n2fnp"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.262405 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.283993 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.286065 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-catalog-content\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293121 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7whtd\" (UniqueName: \"kubernetes.io/projected/64d46cbf-e1a4-4673-9f0e-01371175a1f9-kube-api-access-7whtd\") pod \"nova-operator-controller-manager-6c9b57c67-t2wjn\" (UID: \"64d46cbf-e1a4-4673-9f0e-01371175a1f9\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293300 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.292685 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293440 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2tmp\" (UniqueName: \"kubernetes.io/projected/f8ddf44b-e556-40c6-a3f8-699d756434dd-kube-api-access-k2tmp\") pod \"ovn-operator-controller-manager-54d485fd9-2vwpz\" (UID: \"f8ddf44b-e556-40c6-a3f8-699d756434dd\") " pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293563 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxvx\" (UniqueName: \"kubernetes.io/projected/0e606b13-be7c-4699-bb4b-5c50ddf32426-kube-api-access-sbxvx\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"
Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.293689 4959 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.293761 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert podName:0e606b13-be7c-4699-bb4b-5c50ddf32426 nodeName:}" failed.
No retries permitted until 2025-10-07 13:14:36.793741629 +0000 UTC m=+828.954464386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert") pod "openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" (UID: "0e606b13-be7c-4699-bb4b-5c50ddf32426") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hr7\" (UniqueName: \"kubernetes.io/projected/72de1e9e-2526-42a4-bbd6-fc89237e75a7-kube-api-access-c7hr7\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293906 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cj8\" (UniqueName: \"kubernetes.io/projected/171d0807-668d-4284-ab63-698401676fbe-kube-api-access-z5cj8\") pod \"octavia-operator-controller-manager-69f59f9d8-sfhzx\" (UID: \"171d0807-668d-4284-ab63-698401676fbe\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.293982 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-utilities\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.292904 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.286560 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnk2n\" (UniqueName: \"kubernetes.io/projected/eea6d6d3-ded0-4788-8901-34c02d659aee-kube-api-access-jnk2n\") pod \"neutron-operator-controller-manager-69b956fbf6-6tcd6\" (UID: \"eea6d6d3-ded0-4788-8901-34c02d659aee\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.298034 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wwn\" (UniqueName: \"kubernetes.io/projected/5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85-kube-api-access-j4wwn\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-6kq7p\" (UID: \"5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.309016 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqsk\" (UniqueName: \"kubernetes.io/projected/6e224af6-7095-4878-ba65-3a8e3f358968-kube-api-access-9rqsk\") pod \"manila-operator-controller-manager-7cb48dbc-26qz6\" (UID: \"6e224af6-7095-4878-ba65-3a8e3f358968\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.309964 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.324291 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7whtd\" (UniqueName: \"kubernetes.io/projected/64d46cbf-e1a4-4673-9f0e-01371175a1f9-kube-api-access-7whtd\") pod \"nova-operator-controller-manager-6c9b57c67-t2wjn\" (UID: \"64d46cbf-e1a4-4673-9f0e-01371175a1f9\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.326009 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.326680 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.338330 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-j2f4q"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.344537 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.365983 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cj8\" (UniqueName: \"kubernetes.io/projected/171d0807-668d-4284-ab63-698401676fbe-kube-api-access-z5cj8\") pod \"octavia-operator-controller-manager-69f59f9d8-sfhzx\" (UID: \"171d0807-668d-4284-ab63-698401676fbe\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.374740 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-sbxvx\" (UniqueName: \"kubernetes.io/projected/0e606b13-be7c-4699-bb4b-5c50ddf32426-kube-api-access-sbxvx\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.394545 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397147 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvd6\" (UniqueName: \"kubernetes.io/projected/86969b11-9037-4890-93dc-575b83669d0f-kube-api-access-lxvd6\") pod \"swift-operator-controller-manager-76d5577b-k6btb\" (UID: \"86969b11-9037-4890-93dc-575b83669d0f\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397209 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hr7\" (UniqueName: \"kubernetes.io/projected/72de1e9e-2526-42a4-bbd6-fc89237e75a7-kube-api-access-c7hr7\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397237 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-utilities\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397429 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-catalog-content\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397583 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxr79\" (UniqueName: \"kubernetes.io/projected/d8ff35a5-f26c-4077-bdad-baa63159c6e4-kube-api-access-hxr79\") pod \"placement-operator-controller-manager-66f6d6849b-nkfnv\" (UID: \"d8ff35a5-f26c-4077-bdad-baa63159c6e4\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397703 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2tmp\" (UniqueName: \"kubernetes.io/projected/f8ddf44b-e556-40c6-a3f8-699d756434dd-kube-api-access-k2tmp\") pod \"ovn-operator-controller-manager-54d485fd9-2vwpz\" (UID: \"f8ddf44b-e556-40c6-a3f8-699d756434dd\") " pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397923 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-catalog-content\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.397723 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-utilities\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.414973 4959 util.go:30] "No sandbox for
pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.444882 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.446570 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.447895 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.450819 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-pbff6"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.458354 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.458616 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.468381 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.469600 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.474184 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-k7tbq"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.481380 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.483542 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7hr7\" (UniqueName: \"kubernetes.io/projected/72de1e9e-2526-42a4-bbd6-fc89237e75a7-kube-api-access-c7hr7\") pod \"redhat-operators-xdxcf\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") " pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.491313 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2tmp\" (UniqueName: \"kubernetes.io/projected/f8ddf44b-e556-40c6-a3f8-699d756434dd-kube-api-access-k2tmp\") pod \"ovn-operator-controller-manager-54d485fd9-2vwpz\" (UID: \"f8ddf44b-e556-40c6-a3f8-699d756434dd\") " pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.518443 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvd6\" (UniqueName: \"kubernetes.io/projected/86969b11-9037-4890-93dc-575b83669d0f-kube-api-access-lxvd6\") pod \"swift-operator-controller-manager-76d5577b-k6btb\" (UID: \"86969b11-9037-4890-93dc-575b83669d0f\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.518544 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.518552 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lzxj\" (UniqueName: \"kubernetes.io/projected/1062e16d-6129-48d2-a385-d988ac5fe4f7-kube-api-access-6lzxj\") pod \"watcher-operator-controller-manager-5d98cc5575-rgw4d\" (UID: \"1062e16d-6129-48d2-a385-d988ac5fe4f7\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.520180 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxr79\" (UniqueName: \"kubernetes.io/projected/d8ff35a5-f26c-4077-bdad-baa63159c6e4-kube-api-access-hxr79\") pod \"placement-operator-controller-manager-66f6d6849b-nkfnv\" (UID: \"d8ff35a5-f26c-4077-bdad-baa63159c6e4\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.520317 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgxqt\" (UniqueName: \"kubernetes.io/projected/cac1fe47-f06a-44fb-b4fe-a19faa802cca-kube-api-access-qgxqt\") pod \"telemetry-operator-controller-manager-f589c7597-sp68w\" (UID: \"cac1fe47-f06a-44fb-b4fe-a19faa802cca\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.540753 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.572094 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxr79\" (UniqueName: \"kubernetes.io/projected/d8ff35a5-f26c-4077-bdad-baa63159c6e4-kube-api-access-hxr79\") pod \"placement-operator-controller-manager-66f6d6849b-nkfnv\" (UID: \"d8ff35a5-f26c-4077-bdad-baa63159c6e4\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.575006 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvd6\" (UniqueName: \"kubernetes.io/projected/86969b11-9037-4890-93dc-575b83669d0f-kube-api-access-lxvd6\") pod \"swift-operator-controller-manager-76d5577b-k6btb\" (UID: \"86969b11-9037-4890-93dc-575b83669d0f\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.602031 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.610487 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.636293 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lzxj\" (UniqueName: \"kubernetes.io/projected/1062e16d-6129-48d2-a385-d988ac5fe4f7-kube-api-access-6lzxj\") pod \"watcher-operator-controller-manager-5d98cc5575-rgw4d\" (UID: \"1062e16d-6129-48d2-a385-d988ac5fe4f7\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.636563 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nlg4\" (UniqueName: \"kubernetes.io/projected/4d7dd390-0ad9-42df-9a4f-c8804639fa3f-kube-api-access-6nlg4\") pod \"test-operator-controller-manager-6bb6dcddc-wrtz7\" (UID: \"4d7dd390-0ad9-42df-9a4f-c8804639fa3f\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.636852 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgxqt\" (UniqueName: \"kubernetes.io/projected/cac1fe47-f06a-44fb-b4fe-a19faa802cca-kube-api-access-qgxqt\") pod \"telemetry-operator-controller-manager-f589c7597-sp68w\" (UID: \"cac1fe47-f06a-44fb-b4fe-a19faa802cca\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.637001 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3213b12-9128-4d7c-8ec8-a731e6627de4-cert\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.655124 4959
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3213b12-9128-4d7c-8ec8-a731e6627de4-cert\") pod \"infra-operator-controller-manager-84788b6bc5-d772s\" (UID: \"e3213b12-9128-4d7c-8ec8-a731e6627de4\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.672247 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgxqt\" (UniqueName: \"kubernetes.io/projected/cac1fe47-f06a-44fb-b4fe-a19faa802cca-kube-api-access-qgxqt\") pod \"telemetry-operator-controller-manager-f589c7597-sp68w\" (UID: \"cac1fe47-f06a-44fb-b4fe-a19faa802cca\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.693375 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lzxj\" (UniqueName: \"kubernetes.io/projected/1062e16d-6129-48d2-a385-d988ac5fe4f7-kube-api-access-6lzxj\") pod \"watcher-operator-controller-manager-5d98cc5575-rgw4d\" (UID: \"1062e16d-6129-48d2-a385-d988ac5fe4f7\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.695042 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.714576 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.715855 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.719814 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.720073 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xmvtf"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.730241 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.732959 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.748564 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlg4\" (UniqueName: \"kubernetes.io/projected/4d7dd390-0ad9-42df-9a4f-c8804639fa3f-kube-api-access-6nlg4\") pod \"test-operator-controller-manager-6bb6dcddc-wrtz7\" (UID: \"4d7dd390-0ad9-42df-9a4f-c8804639fa3f\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.769521 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.775203 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlg4\" (UniqueName: \"kubernetes.io/projected/4d7dd390-0ad9-42df-9a4f-c8804639fa3f-kube-api-access-6nlg4\") pod \"test-operator-controller-manager-6bb6dcddc-wrtz7\" (UID: \"4d7dd390-0ad9-42df-9a4f-c8804639fa3f\") "
pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.777272 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.778314 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.780789 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-nznk5"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.791692 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.813017 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.856015 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.857592 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.857652 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98lkv\" (UniqueName: \"kubernetes.io/projected/370cd57f-855c-4584-a0c1-c806f93bd8d7-kube-api-access-98lkv\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.857728 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.857747 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698cl\" (UniqueName: \"kubernetes.io/projected/1f48e97d-5d4f-49d3-b550-d51242109806-kube-api-access-698cl\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp\" (UID: \"1f48e97d-5d4f-49d3-b550-d51242109806\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp"
Oct 07 13:14:36 crc kubenswrapper[4959]: E1007
13:14:36.858970 4959 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.859008 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert podName:0e606b13-be7c-4699-bb4b-5c50ddf32426 nodeName:}" failed. No retries permitted until 2025-10-07 13:14:37.858994921 +0000 UTC m=+830.019717598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert") pod "openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" (UID: "0e606b13-be7c-4699-bb4b-5c50ddf32426") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.867987 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.897084 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g"]
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.897901 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.958519 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698cl\" (UniqueName: \"kubernetes.io/projected/1f48e97d-5d4f-49d3-b550-d51242109806-kube-api-access-698cl\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp\" (UID: \"1f48e97d-5d4f-49d3-b550-d51242109806\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.958665 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.958693 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98lkv\" (UniqueName: \"kubernetes.io/projected/370cd57f-855c-4584-a0c1-c806f93bd8d7-kube-api-access-98lkv\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.959305 4959 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 07 13:14:36 crc kubenswrapper[4959]: E1007 13:14:36.959364 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert podName:370cd57f-855c-4584-a0c1-c806f93bd8d7 nodeName:}" failed. No retries permitted until 2025-10-07 13:14:37.459348904 +0000 UTC m=+829.620071581 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert") pod "openstack-operator-controller-manager-fd79fd9-ktzrv" (UID: "370cd57f-855c-4584-a0c1-c806f93bd8d7") : secret "webhook-server-cert" not found Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.981327 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98lkv\" (UniqueName: \"kubernetes.io/projected/370cd57f-855c-4584-a0c1-c806f93bd8d7-kube-api-access-98lkv\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv" Oct 07 13:14:36 crc kubenswrapper[4959]: I1007 13:14:36.981983 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698cl\" (UniqueName: \"kubernetes.io/projected/1f48e97d-5d4f-49d3-b550-d51242109806-kube-api-access-698cl\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp\" (UID: \"1f48e97d-5d4f-49d3-b550-d51242109806\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.172556 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.405705 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" event={"ID":"4294ed44-d412-4366-959e-cb534ab792bc","Type":"ContainerStarted","Data":"f897d42aca3ec3285ad2b75d63cf59a9dad07aef63d7a946cff67e31a87f93b4"} Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.417535 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" event={"ID":"035c3aeb-396b-47bf-a588-562bb0f27f88","Type":"ContainerStarted","Data":"4c92836014bb0ea5b4ce86054d97dfee96e703713f7ec37e2e9e9c8d09e26c51"} Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.468164 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv" Oct 07 13:14:37 crc kubenswrapper[4959]: E1007 13:14:37.468312 4959 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 07 13:14:37 crc kubenswrapper[4959]: E1007 13:14:37.468359 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert podName:370cd57f-855c-4584-a0c1-c806f93bd8d7 nodeName:}" failed. No retries permitted until 2025-10-07 13:14:38.468345755 +0000 UTC m=+830.629068432 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert") pod "openstack-operator-controller-manager-fd79fd9-ktzrv" (UID: "370cd57f-855c-4584-a0c1-c806f93bd8d7") : secret "webhook-server-cert" not found Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.498897 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk"] Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.531072 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc"] Oct 07 13:14:37 crc kubenswrapper[4959]: W1007 13:14:37.539652 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77bcfec2_4667_4415_af5e_3009e5ea4999.slice/crio-bc53ceca6e1cb9ddf5903a616d156c917f77de4a080ab3ad08d152eb12aafac7 WatchSource:0}: Error finding container bc53ceca6e1cb9ddf5903a616d156c917f77de4a080ab3ad08d152eb12aafac7: Status 404 returned error can't find the container with id bc53ceca6e1cb9ddf5903a616d156c917f77de4a080ab3ad08d152eb12aafac7 Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.859941 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9"] Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.867235 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"] Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.873597 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" Oct 07 13:14:37 crc kubenswrapper[4959]: W1007 13:14:37.875228 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171d0807_668d_4284_ab63_698401676fbe.slice/crio-df36303fb17128911b71d8d774bfdeeed561fee475794a0cea1b15da3fbf3348 WatchSource:0}: Error finding container df36303fb17128911b71d8d774bfdeeed561fee475794a0cea1b15da3fbf3348: Status 404 returned error can't find the container with id df36303fb17128911b71d8d774bfdeeed561fee475794a0cea1b15da3fbf3348 Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.880418 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e606b13-be7c-4699-bb4b-5c50ddf32426-cert\") pod \"openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw\" (UID: \"0e606b13-be7c-4699-bb4b-5c50ddf32426\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.890368 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p"] Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.910488 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"] Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.912674 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg"] Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.917986 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6"] Oct 07 13:14:37 crc kubenswrapper[4959]: W1007 13:14:37.919817 4959 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea6d6d3_ded0_4788_8901_34c02d659aee.slice/crio-74310ae3da24f6f8378dcf75b26ccd40e84f52a860fb560d3aff2804f161bcf2 WatchSource:0}: Error finding container 74310ae3da24f6f8378dcf75b26ccd40e84f52a860fb560d3aff2804f161bcf2: Status 404 returned error can't find the container with id 74310ae3da24f6f8378dcf75b26ccd40e84f52a860fb560d3aff2804f161bcf2 Oct 07 13:14:37 crc kubenswrapper[4959]: W1007 13:14:37.936442 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b13867_f984_4d88_af12_28fc3ebc0b9f.slice/crio-2f9d3058f36e3dcee2beef01b62d6d256341b3f609be5c8cf279ce43e45bfe73 WatchSource:0}: Error finding container 2f9d3058f36e3dcee2beef01b62d6d256341b3f609be5c8cf279ce43e45bfe73: Status 404 returned error can't find the container with id 2f9d3058f36e3dcee2beef01b62d6d256341b3f609be5c8cf279ce43e45bfe73 Oct 07 13:14:37 crc kubenswrapper[4959]: I1007 13:14:37.987584 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.122763 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.129071 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.135648 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.177325 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f"] Oct 07 13:14:38 crc kubenswrapper[4959]: W1007 13:14:38.179376 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e224af6_7095_4878_ba65_3a8e3f358968.slice/crio-7ad04331406e51fb8e2dadaf08e6e7d299abf95bf3eb1eeee006978924ca8c28 WatchSource:0}: Error finding container 7ad04331406e51fb8e2dadaf08e6e7d299abf95bf3eb1eeee006978924ca8c28: Status 404 returned error can't find the container with id 7ad04331406e51fb8e2dadaf08e6e7d299abf95bf3eb1eeee006978924ca8c28 Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.190287 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.196206 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.203329 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.203997 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"] Oct 07 13:14:38 crc kubenswrapper[4959]: W1007 13:14:38.221900 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749f8ff6_9e1c_45ef_948f_1f8c255b670e.slice/crio-7aab44480a1c8bae478b694d836a7c07c66a083a19d0526af475610313702145 WatchSource:0}: Error finding container 7aab44480a1c8bae478b694d836a7c07c66a083a19d0526af475610313702145: Status 404 returned error can't find the container with id 7aab44480a1c8bae478b694d836a7c07c66a083a19d0526af475610313702145 Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.228306 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.234239 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdxcf"] Oct 07 13:14:38 crc kubenswrapper[4959]: W1007 13:14:38.278772 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1062e16d_6129_48d2_a385_d988ac5fe4f7.slice/crio-6e1c020c3f778d6d4aae75333f9d08e6bbb8bf1aafd724780c499cebbe201934 WatchSource:0}: Error finding container 6e1c020c3f778d6d4aae75333f9d08e6bbb8bf1aafd724780c499cebbe201934: Status 404 returned error can't find the container with id 6e1c020c3f778d6d4aae75333f9d08e6bbb8bf1aafd724780c499cebbe201934 Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.314915 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp"] Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.322421 4959 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"] Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.340928 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nlg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6bb6dcddc-wrtz7_openstack-operators(4d7dd390-0ad9-42df-9a4f-c8804639fa3f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.346024 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9fv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b477879bc-nf6mt_openstack-operators(539702ff-226a-4c31-b715-af9af8ae1205): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.350946 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vl7sf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-84788b6bc5-d772s_openstack-operators(e3213b12-9128-4d7c-8ec8-a731e6627de4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 13:14:38 crc kubenswrapper[4959]: W1007 13:14:38.351264 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f48e97d_5d4f_49d3_b550_d51242109806.slice/crio-7712787d34e753f05ab2cedc8d71ed893e3bc83fc2d9d2f77211ec632bdf1ea8 WatchSource:0}: Error finding container 7712787d34e753f05ab2cedc8d71ed893e3bc83fc2d9d2f77211ec632bdf1ea8: Status 404 returned error can't find the container with id 7712787d34e753f05ab2cedc8d71ed893e3bc83fc2d9d2f77211ec632bdf1ea8 Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.368260 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lxvd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-76d5577b-k6btb_openstack-operators(86969b11-9037-4890-93dc-575b83669d0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.368312 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-698cl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp_openstack-operators(1f48e97d-5d4f-49d3-b550-d51242109806): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.369433 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" podUID="1f48e97d-5d4f-49d3-b550-d51242109806" Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.429115 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" event={"ID":"86969b11-9037-4890-93dc-575b83669d0f","Type":"ContainerStarted","Data":"cf766224b7aee955a8b0f2744ac754f7d4ad3a0212344f14fa7b950735af128f"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.430214 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" 
event={"ID":"01b13867-f984-4d88-af12-28fc3ebc0b9f","Type":"ContainerStarted","Data":"2f9d3058f36e3dcee2beef01b62d6d256341b3f609be5c8cf279ce43e45bfe73"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.431164 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" event={"ID":"539702ff-226a-4c31-b715-af9af8ae1205","Type":"ContainerStarted","Data":"060a2d886cd700b585b143fb50decee19e734944341367abcdf0fe23946c9d7b"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.432492 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerStarted","Data":"8c3ac641394a373a5e09d1b344bbbc6ad3bf87e43cd2ed0457960f75fe25155b"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.434182 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" event={"ID":"e3213b12-9128-4d7c-8ec8-a731e6627de4","Type":"ContainerStarted","Data":"86d10854c4a3e8c1abb86a56247397245d5952244c6c7a7322fb0a6c4a3670c6"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.435442 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" event={"ID":"c2a805f1-946a-4b48-9e52-4f24b56bd43a","Type":"ContainerStarted","Data":"5c50f04fb886a696b8979376e76c8738f83f80be53ac4d1ac3de9b70f0c00365"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.436612 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" event={"ID":"f8ddf44b-e556-40c6-a3f8-699d756434dd","Type":"ContainerStarted","Data":"eb3dca02955cd44d0f27d36d93f28c8b9c16476c5907226b8bd5ed073111f199"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.437531 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" event={"ID":"77bcfec2-4667-4415-af5e-3009e5ea4999","Type":"ContainerStarted","Data":"bc53ceca6e1cb9ddf5903a616d156c917f77de4a080ab3ad08d152eb12aafac7"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.439281 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" event={"ID":"4d7dd390-0ad9-42df-9a4f-c8804639fa3f","Type":"ContainerStarted","Data":"bbeefc611dc8e62e3a2d7891d1352a207535241990ed55df9aa319494a90277b"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.442511 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d" event={"ID":"1062e16d-6129-48d2-a385-d988ac5fe4f7","Type":"ContainerStarted","Data":"6e1c020c3f778d6d4aae75333f9d08e6bbb8bf1aafd724780c499cebbe201934"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.453667 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" event={"ID":"6e224af6-7095-4878-ba65-3a8e3f358968","Type":"ContainerStarted","Data":"7ad04331406e51fb8e2dadaf08e6e7d299abf95bf3eb1eeee006978924ca8c28"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.465025 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv" event={"ID":"d8ff35a5-f26c-4077-bdad-baa63159c6e4","Type":"ContainerStarted","Data":"8acc23f5936d1030cc7ffbe3b26172ac928c2706f5d2c0190499f5b3f8e5e56c"} Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.466983 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w" event={"ID":"cac1fe47-f06a-44fb-b4fe-a19faa802cca","Type":"ContainerStarted","Data":"4cffb95f84583f1b16c8de749b4b61f13f2b3a0a8f63be0ee7a6c702e4b56a4e"} Oct 07 13:14:38 crc 
kubenswrapper[4959]: I1007 13:14:38.468430 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" event={"ID":"64d46cbf-e1a4-4673-9f0e-01371175a1f9","Type":"ContainerStarted","Data":"1c03aaa3bcc960bd8f20c67eee80a587972b528960d5ba6b333e7c78be7dd05c"}
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.471849 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" event={"ID":"eea6d6d3-ded0-4788-8901-34c02d659aee","Type":"ContainerStarted","Data":"74310ae3da24f6f8378dcf75b26ccd40e84f52a860fb560d3aff2804f161bcf2"}
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.473424 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" event={"ID":"749f8ff6-9e1c-45ef-948f-1f8c255b670e","Type":"ContainerStarted","Data":"7aab44480a1c8bae478b694d836a7c07c66a083a19d0526af475610313702145"}
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.474827 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" event={"ID":"1f48e97d-5d4f-49d3-b550-d51242109806","Type":"ContainerStarted","Data":"7712787d34e753f05ab2cedc8d71ed893e3bc83fc2d9d2f77211ec632bdf1ea8"}
Oct 07 13:14:38 crc kubenswrapper[4959]: E1007 13:14:38.476254 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" podUID="1f48e97d-5d4f-49d3-b550-d51242109806"
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.477920 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" event={"ID":"3e59fb97-6ef4-42a5-a264-506bdccd8a23","Type":"ContainerStarted","Data":"6894e184bf2f2ca5abd4e69e70c42cdaa557c8cf8cfd3f2e2e38af68fefaf06f"}
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.479238 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" event={"ID":"5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85","Type":"ContainerStarted","Data":"c80d01ad517e21ca4e80506ff1e2e9d1cd3ae1930886a2f561ffe652295ef0bd"}
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.485311 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.486672 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" event={"ID":"171d0807-668d-4284-ab63-698401676fbe","Type":"ContainerStarted","Data":"df36303fb17128911b71d8d774bfdeeed561fee475794a0cea1b15da3fbf3348"}
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.491600 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/370cd57f-855c-4584-a0c1-c806f93bd8d7-cert\") pod \"openstack-operator-controller-manager-fd79fd9-ktzrv\" (UID: \"370cd57f-855c-4584-a0c1-c806f93bd8d7\") " pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.593444 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:38 crc kubenswrapper[4959]: I1007 13:14:38.701948 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"]
Oct 07 13:14:38 crc kubenswrapper[4959]: W1007 13:14:38.746753 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e606b13_be7c_4699_bb4b_5c50ddf32426.slice/crio-d0fdfa504642ebf72dd86219fa9582c7f9b2260e1c80ea95875737f7d3c8741a WatchSource:0}: Error finding container d0fdfa504642ebf72dd86219fa9582c7f9b2260e1c80ea95875737f7d3c8741a: Status 404 returned error can't find the container with id d0fdfa504642ebf72dd86219fa9582c7f9b2260e1c80ea95875737f7d3c8741a
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.083971 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" podUID="86969b11-9037-4890-93dc-575b83669d0f"
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.084326 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" podUID="539702ff-226a-4c31-b715-af9af8ae1205"
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.085519 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f"
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.090170 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" podUID="e3213b12-9128-4d7c-8ec8-a731e6627de4"
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.255931 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"]
Oct 07 13:14:39 crc kubenswrapper[4959]: W1007 13:14:39.269670 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod370cd57f_855c_4584_a0c1_c806f93bd8d7.slice/crio-91df9921530d54918a2066572b76b051ea3a86b0c02733eaa72b0bfff35a2a0e WatchSource:0}: Error finding container 91df9921530d54918a2066572b76b051ea3a86b0c02733eaa72b0bfff35a2a0e: Status 404 returned error can't find the container with id 91df9921530d54918a2066572b76b051ea3a86b0c02733eaa72b0bfff35a2a0e
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.499123 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv" event={"ID":"370cd57f-855c-4584-a0c1-c806f93bd8d7","Type":"ContainerStarted","Data":"91df9921530d54918a2066572b76b051ea3a86b0c02733eaa72b0bfff35a2a0e"}
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.501954 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" event={"ID":"4d7dd390-0ad9-42df-9a4f-c8804639fa3f","Type":"ContainerStarted","Data":"6a154f9a5e1a23a1a6ce823732131b4e151887b6ee787e1271339e73aaedcfb8"}
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.504269 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f"
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.504346 4959 generic.go:334] "Generic (PLEG): container finished" podID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerID="a8ae1839419c535402300ddf83cfc3f2557c3c46fbdf201020b79f3f64ef97d8" exitCode=0
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.504450 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerDied","Data":"a8ae1839419c535402300ddf83cfc3f2557c3c46fbdf201020b79f3f64ef97d8"}
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.508341 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" event={"ID":"86969b11-9037-4890-93dc-575b83669d0f","Type":"ContainerStarted","Data":"de2d8296b7f2b0e98097c42417e9acb9808349a002b5d970c8c01eff772ee087"}
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.521967 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" podUID="86969b11-9037-4890-93dc-575b83669d0f"
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.532923 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" event={"ID":"0e606b13-be7c-4699-bb4b-5c50ddf32426","Type":"ContainerStarted","Data":"d0fdfa504642ebf72dd86219fa9582c7f9b2260e1c80ea95875737f7d3c8741a"}
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.547225 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" event={"ID":"e3213b12-9128-4d7c-8ec8-a731e6627de4","Type":"ContainerStarted","Data":"62aa3701d67cef848d2a1bb278a7646f3f342e48d68f1868d4a138e33d49b809"}
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.552688 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" podUID="e3213b12-9128-4d7c-8ec8-a731e6627de4"
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.561787 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" event={"ID":"539702ff-226a-4c31-b715-af9af8ae1205","Type":"ContainerStarted","Data":"04e4b44dbc4e95c3cd073468577231bee4298f6ef71de6422e9810f7aad8746e"}
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.565604 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" podUID="1f48e97d-5d4f-49d3-b550-d51242109806"
Oct 07 13:14:39 crc kubenswrapper[4959]: E1007 13:14:39.566829 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" podUID="539702ff-226a-4c31-b715-af9af8ae1205"
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.921110 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:14:39 crc kubenswrapper[4959]: I1007 13:14:39.921146 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:14:40 crc kubenswrapper[4959]: I1007 13:14:40.012856 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:14:40 crc kubenswrapper[4959]: I1007 13:14:40.584718 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv" event={"ID":"370cd57f-855c-4584-a0c1-c806f93bd8d7","Type":"ContainerStarted","Data":"e3cf3aeebcfbc7f87828644cccbc1044123f8da5d326f7962773eb79bfc80a7a"}
Oct 07 13:14:40 crc kubenswrapper[4959]: E1007 13:14:40.588456 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:b6ab8fc3ad425eca2e073fe9ba9d5b29d9ea4d9814de7bb799fa330209566cd4\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" podUID="539702ff-226a-4c31-b715-af9af8ae1205"
Oct 07 13:14:40 crc kubenswrapper[4959]: E1007 13:14:40.591821 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" podUID="e3213b12-9128-4d7c-8ec8-a731e6627de4"
Oct 07 13:14:40 crc kubenswrapper[4959]: E1007 13:14:40.592009 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f"
Oct 07 13:14:40 crc kubenswrapper[4959]: E1007 13:14:40.592945 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" podUID="86969b11-9037-4890-93dc-575b83669d0f"
Oct 07 13:14:40 crc kubenswrapper[4959]: I1007 13:14:40.674610 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:14:42 crc kubenswrapper[4959]: I1007 13:14:42.154414 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgd79"]
Oct 07 13:14:42 crc kubenswrapper[4959]: I1007 13:14:42.596808 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vgd79" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="registry-server" containerID="cri-o://49b99173b6709c64e110ae83c5c64570e4187f648dd49c66d0783929e6580bb5" gracePeriod=2
Oct 07 13:14:43 crc kubenswrapper[4959]: I1007 13:14:43.603894 4959 generic.go:334] "Generic (PLEG): container finished" podID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerID="49b99173b6709c64e110ae83c5c64570e4187f648dd49c66d0783929e6580bb5" exitCode=0
Oct 07 13:14:43 crc kubenswrapper[4959]: I1007 13:14:43.603938 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgd79" event={"ID":"a179b1bb-17fb-4ba8-80f9-0741c9b49c04","Type":"ContainerDied","Data":"49b99173b6709c64e110ae83c5c64570e4187f648dd49c66d0783929e6580bb5"}
Oct 07 13:14:45 crc kubenswrapper[4959]: I1007 13:14:45.770042 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-scq9z"]
Oct 07 13:14:45 crc kubenswrapper[4959]: I1007 13:14:45.772563 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:45 crc kubenswrapper[4959]: I1007 13:14:45.779369 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-scq9z"]
Oct 07 13:14:45 crc kubenswrapper[4959]: I1007 13:14:45.919878 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-catalog-content\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:45 crc kubenswrapper[4959]: I1007 13:14:45.920027 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-utilities\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:45 crc kubenswrapper[4959]: I1007 13:14:45.920199 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zg56\" (UniqueName: \"kubernetes.io/projected/e612f4ce-1e8d-4937-8d89-8424cfe7b066-kube-api-access-7zg56\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.021842 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-utilities\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.021950 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zg56\" (UniqueName: \"kubernetes.io/projected/e612f4ce-1e8d-4937-8d89-8424cfe7b066-kube-api-access-7zg56\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.021984 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-catalog-content\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.022463 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-catalog-content\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.022701 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-utilities\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.047617 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zg56\" (UniqueName: \"kubernetes.io/projected/e612f4ce-1e8d-4937-8d89-8424cfe7b066-kube-api-access-7zg56\") pod \"redhat-marketplace-scq9z\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") " pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:46 crc kubenswrapper[4959]: I1007 13:14:46.089266 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:14:49 crc kubenswrapper[4959]: I1007 13:14:49.844410 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:14:49 crc kubenswrapper[4959]: I1007 13:14:49.979315 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-catalog-content\") pod \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") "
Oct 07 13:14:49 crc kubenswrapper[4959]: I1007 13:14:49.979737 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cbpx\" (UniqueName: \"kubernetes.io/projected/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-kube-api-access-5cbpx\") pod \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") "
Oct 07 13:14:49 crc kubenswrapper[4959]: I1007 13:14:49.979827 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-utilities\") pod \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\" (UID: \"a179b1bb-17fb-4ba8-80f9-0741c9b49c04\") "
Oct 07 13:14:49 crc kubenswrapper[4959]: I1007 13:14:49.980959 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-utilities" (OuterVolumeSpecName: "utilities") pod "a179b1bb-17fb-4ba8-80f9-0741c9b49c04" (UID: "a179b1bb-17fb-4ba8-80f9-0741c9b49c04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:14:49 crc kubenswrapper[4959]: I1007 13:14:49.993123 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-kube-api-access-5cbpx" (OuterVolumeSpecName: "kube-api-access-5cbpx") pod "a179b1bb-17fb-4ba8-80f9-0741c9b49c04" (UID: "a179b1bb-17fb-4ba8-80f9-0741c9b49c04"). InnerVolumeSpecName "kube-api-access-5cbpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.062597 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a179b1bb-17fb-4ba8-80f9-0741c9b49c04" (UID: "a179b1bb-17fb-4ba8-80f9-0741c9b49c04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.069295 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-scq9z"]
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.081515 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.081543 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.081557 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cbpx\" (UniqueName: \"kubernetes.io/projected/a179b1bb-17fb-4ba8-80f9-0741c9b49c04-kube-api-access-5cbpx\") on node \"crc\" DevicePath \"\""
Oct 07 13:14:50 crc kubenswrapper[4959]: W1007 13:14:50.199370 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode612f4ce_1e8d_4937_8d89_8424cfe7b066.slice/crio-85058f8b0b13a9b3ef506234739efa962925575699c48c9b3dbe7a2ba2eeed1a WatchSource:0}: Error finding container 85058f8b0b13a9b3ef506234739efa962925575699c48c9b3dbe7a2ba2eeed1a: Status 404 returned error can't find the container with id 85058f8b0b13a9b3ef506234739efa962925575699c48c9b3dbe7a2ba2eeed1a
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.673048 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" event={"ID":"749f8ff6-9e1c-45ef-948f-1f8c255b670e","Type":"ContainerStarted","Data":"fdc6ddbfdac3e11d2ba9bda4723ecb8af82b3d5d8bf7d89f6600f41931081c05"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.680499 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scq9z" event={"ID":"e612f4ce-1e8d-4937-8d89-8424cfe7b066","Type":"ContainerStarted","Data":"85058f8b0b13a9b3ef506234739efa962925575699c48c9b3dbe7a2ba2eeed1a"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.696164 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" event={"ID":"3e59fb97-6ef4-42a5-a264-506bdccd8a23","Type":"ContainerStarted","Data":"84ca34ba6c9e9460449cefb09cca250dd9122db81b1af9cff1b42349dc7bda40"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.698009 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w" event={"ID":"cac1fe47-f06a-44fb-b4fe-a19faa802cca","Type":"ContainerStarted","Data":"9d11512543da276cec9719c2d89dfce835bc08e51c2efedde0c3a14ae086dafb"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.717361 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" event={"ID":"4294ed44-d412-4366-959e-cb534ab792bc","Type":"ContainerStarted","Data":"2f5f13d37cc8e7d048685c6e8422731bf2f90d39204a37abc2b1d7ff05084b9c"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.742139 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" event={"ID":"0e606b13-be7c-4699-bb4b-5c50ddf32426","Type":"ContainerStarted","Data":"a0310c8f1e66dac0d1c9d6586bab3bbdd346ee5eb77bfa76df061ba60bc33052"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.753051 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d" event={"ID":"1062e16d-6129-48d2-a385-d988ac5fe4f7","Type":"ContainerStarted","Data":"ce3ecf2fb60a608f4fea8290e36d050e36c91737c4727135aaf1abb310d8a9d8"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.754408 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv" event={"ID":"d8ff35a5-f26c-4077-bdad-baa63159c6e4","Type":"ContainerStarted","Data":"8b3f3ee537008188b8225a12c9f85d851868b6ef818833492860f1179b0bfb67"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.784871 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" event={"ID":"f8ddf44b-e556-40c6-a3f8-699d756434dd","Type":"ContainerStarted","Data":"6fb7e1ab5f1db57cb79e99c00672e2a1e83aaffb325e8ff8b6d8db2c8ee6a19a"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.790885 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" event={"ID":"01b13867-f984-4d88-af12-28fc3ebc0b9f","Type":"ContainerStarted","Data":"28589c4c935788137b31276cadf631f0aa1d2a21163ef0eb0b6a4a7c837cf2fc"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.799538 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.799522 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgd79" event={"ID":"a179b1bb-17fb-4ba8-80f9-0741c9b49c04","Type":"ContainerDied","Data":"e797f489ca9aa8056041d492da526ee3d781ab57ca3428adb4ac5cf20422c824"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.799872 4959 scope.go:117] "RemoveContainer" containerID="49b99173b6709c64e110ae83c5c64570e4187f648dd49c66d0783929e6580bb5"
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.801715 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" event={"ID":"035c3aeb-396b-47bf-a588-562bb0f27f88","Type":"ContainerStarted","Data":"efa862e95e04502a26a824b63bf235a585ce2853fef1651adbb60a6804c9dd24"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.804112 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv" event={"ID":"370cd57f-855c-4584-a0c1-c806f93bd8d7","Type":"ContainerStarted","Data":"9f1a347594f4d371096b4cfa99c59d832271ea87d91f7205d6d6bda6ec7553f0"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.804237 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.806900 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" event={"ID":"77bcfec2-4667-4415-af5e-3009e5ea4999","Type":"ContainerStarted","Data":"be60771a30d463a92bb28211cc1293a861c769bcf4c9308008370fa54403dc80"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.825794 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv"
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.825826 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerStarted","Data":"e63945291aa27df10571b1e69fe74c94ae383a28921efeea4a2fefc7d400c590"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.828138 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" event={"ID":"171d0807-668d-4284-ab63-698401676fbe","Type":"ContainerStarted","Data":"da18010d86e6f838f677d7631015e958157ae81f30fcc5045a8dd32088f29905"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.829688 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" event={"ID":"64d46cbf-e1a4-4673-9f0e-01371175a1f9","Type":"ContainerStarted","Data":"020618316a7078b3c3c4a032c691143bfb4706cfecc948f3ccc0df62f1517ca4"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.834039 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" event={"ID":"eea6d6d3-ded0-4788-8901-34c02d659aee","Type":"ContainerStarted","Data":"cdbb63b93b277293c4aba2b22568ac771d1b25efe9f1b6e4a39d007dc0adfb05"}
Oct 07 13:14:50 crc kubenswrapper[4959]: I1007 13:14:50.847940 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-fd79fd9-ktzrv" podStartSLOduration=14.847917596 podStartE2EDuration="14.847917596s" podCreationTimestamp="2025-10-07 13:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:14:50.841647255 +0000 UTC m=+843.002369932" watchObservedRunningTime="2025-10-07 13:14:50.847917596 +0000 UTC m=+843.008640273"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.187954 4959 scope.go:117] "RemoveContainer" containerID="72c8031d2f7d38267bb15a9529b702fcd570b6518eb660935a4b15530fc9e1b5"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.246042 4959 scope.go:117] "RemoveContainer" containerID="27e07a6777033a2945a8f429311cf233b757be5a9a62fab38bc77ea6e3558fee"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.842388 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" event={"ID":"77bcfec2-4667-4415-af5e-3009e5ea4999","Type":"ContainerStarted","Data":"f17e6264ec815b8349eb95202381ba6981a364c48463b7c8c114ce1a26374d10"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.842786 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.844082 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" event={"ID":"0e606b13-be7c-4699-bb4b-5c50ddf32426","Type":"ContainerStarted","Data":"005e70f8ed37b0e6ceff43f1ebac59bc7f277f45044fc8c53b84b83a1871c874"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.844249 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.845565 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d" event={"ID":"1062e16d-6129-48d2-a385-d988ac5fe4f7","Type":"ContainerStarted","Data":"d3dc899b74e128b10d3e293bd2b4bd2c1545102925bf15840d3f5c0d6136dea2"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.845664 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.853651 4959 generic.go:334] "Generic (PLEG): container finished" podID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerID="a2e76c5193151e1c072d53649a037e84ec917965d706c7d27e06f0737979b6b5" exitCode=0
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.853717 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scq9z" event={"ID":"e612f4ce-1e8d-4937-8d89-8424cfe7b066","Type":"ContainerDied","Data":"a2e76c5193151e1c072d53649a037e84ec917965d706c7d27e06f0737979b6b5"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.856822 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" event={"ID":"3e59fb97-6ef4-42a5-a264-506bdccd8a23","Type":"ContainerStarted","Data":"f5a537610cdbabb3a1968e09c90903e4b88534ff6bd4822e28486efa6af7d8e8"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.857217 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.866025 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" event={"ID":"5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85","Type":"ContainerStarted","Data":"cfc81a31f9d4edb4af23f61d9aacfc3ecadcb18ad389c495a1d4dac90c405466"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.866071 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" event={"ID":"5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85","Type":"ContainerStarted","Data":"c2fb6f05322d28f87a127b08c18ad057a0de98d88941e182f5bc797660f3a1bf"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.866590 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.866920 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" podStartSLOduration=4.746505538 podStartE2EDuration="16.866911886s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.547485796 +0000 UTC m=+829.708208473" lastFinishedPulling="2025-10-07 13:14:49.667892144 +0000 UTC m=+841.828614821" observedRunningTime="2025-10-07 13:14:51.865436344 +0000 UTC m=+844.026159031" watchObservedRunningTime="2025-10-07 13:14:51.866911886 +0000 UTC m=+844.027634563"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.875896 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" event={"ID":"64d46cbf-e1a4-4673-9f0e-01371175a1f9","Type":"ContainerStarted","Data":"f8ad989b41b628cb84125feff61b1358774117ed65213ec3b145228ca99b7668"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.876729 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.878289 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" event={"ID":"eea6d6d3-ded0-4788-8901-34c02d659aee","Type":"ContainerStarted","Data":"377f6114821cc0cf3c3e9b9941d2adfb323b036a9135128dfd051a90c8ad160e"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.878846 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.882303 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w" event={"ID":"cac1fe47-f06a-44fb-b4fe-a19faa802cca","Type":"ContainerStarted","Data":"50ed2b28003348136a660bade7f37ff9c87ef41b8f00e785b3778fef077cdfb2"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.882368 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.883969 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" event={"ID":"171d0807-668d-4284-ab63-698401676fbe","Type":"ContainerStarted","Data":"e878cca89adb5caca1913dae5419d57393ccac85f82cb76f9dfce6d26448ca7c"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.884428 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.890138 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" event={"ID":"035c3aeb-396b-47bf-a588-562bb0f27f88","Type":"ContainerStarted","Data":"2b3ccc379759b31fcde84fd05d6e88b23317f9bd976bc0ecc0d94b751fb8c3d0"}
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.890258 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h"
Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.894500 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" podStartSLOduration=6.006777352 podStartE2EDuration="16.894484601s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07
13:14:38.752847477 +0000 UTC m=+830.913570154" lastFinishedPulling="2025-10-07 13:14:49.640554726 +0000 UTC m=+841.801277403" observedRunningTime="2025-10-07 13:14:51.890336111 +0000 UTC m=+844.051058788" watchObservedRunningTime="2025-10-07 13:14:51.894484601 +0000 UTC m=+844.055207278" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.894843 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" event={"ID":"01b13867-f984-4d88-af12-28fc3ebc0b9f","Type":"ContainerStarted","Data":"f7fff95c33da2e52589803b57165085d0d3faa3a38d80f298bf92c79b6e4edeb"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.895436 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.901567 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" event={"ID":"6e224af6-7095-4878-ba65-3a8e3f358968","Type":"ContainerStarted","Data":"5ceeeb15102dab28232250c210cd3f0153199d98505e39bc18454bc9ec9af2d2"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.901590 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" event={"ID":"6e224af6-7095-4878-ba65-3a8e3f358968","Type":"ContainerStarted","Data":"d7bc5eac7a5e640a60316bed2d9ce6455a824ffcd7754c9482f07d90121a39ac"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.901657 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.907288 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv" 
event={"ID":"d8ff35a5-f26c-4077-bdad-baa63159c6e4","Type":"ContainerStarted","Data":"e2c3ebdd5c8e5ab972e80617ce258f09dbbf2c297c55c27de80103fb92b0e7d7"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.907736 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.909412 4959 generic.go:334] "Generic (PLEG): container finished" podID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerID="e63945291aa27df10571b1e69fe74c94ae383a28921efeea4a2fefc7d400c590" exitCode=0 Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.909471 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerDied","Data":"e63945291aa27df10571b1e69fe74c94ae383a28921efeea4a2fefc7d400c590"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.946492 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" event={"ID":"4294ed44-d412-4366-959e-cb534ab792bc","Type":"ContainerStarted","Data":"30ced06e8b75dc42fde300364c81e994c390f68a696c8aabfc042b070f568a3d"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.946676 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.955360 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" podStartSLOduration=5.184123961 podStartE2EDuration="16.955344095s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.890553334 +0000 UTC m=+830.051276011" lastFinishedPulling="2025-10-07 13:14:49.661773478 +0000 UTC m=+841.822496145" 
observedRunningTime="2025-10-07 13:14:51.947705205 +0000 UTC m=+844.108427882" watchObservedRunningTime="2025-10-07 13:14:51.955344095 +0000 UTC m=+844.116066772" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.955805 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" event={"ID":"c2a805f1-946a-4b48-9e52-4f24b56bd43a","Type":"ContainerStarted","Data":"3004e4cfccb3bdec3b72b88d6d20588120f26315bba4ef6ae587900718a3c006"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.955836 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" event={"ID":"c2a805f1-946a-4b48-9e52-4f24b56bd43a","Type":"ContainerStarted","Data":"170774cba8d7f0d07f47e77418774788d6ec498ebc72ee0b4d58a9c4fbe3e1be"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.956019 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.961963 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" event={"ID":"f8ddf44b-e556-40c6-a3f8-699d756434dd","Type":"ContainerStarted","Data":"4a9d69bf326d91da7d1a3a4a99974fffc7ee8613a2046066455c42aa4d59bb94"} Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.962415 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.973194 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" event={"ID":"749f8ff6-9e1c-45ef-948f-1f8c255b670e","Type":"ContainerStarted","Data":"9ad1095fd96bbb823e665bca03b702362eca21af8032ec6e1da95be1c0b52860"} Oct 07 13:14:51 crc 
kubenswrapper[4959]: I1007 13:14:51.973231 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" Oct 07 13:14:51 crc kubenswrapper[4959]: I1007 13:14:51.995329 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d" podStartSLOduration=4.616988038 podStartE2EDuration="15.995313567s" podCreationTimestamp="2025-10-07 13:14:36 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.283289644 +0000 UTC m=+830.444012321" lastFinishedPulling="2025-10-07 13:14:49.661615173 +0000 UTC m=+841.822337850" observedRunningTime="2025-10-07 13:14:51.991545959 +0000 UTC m=+844.152268636" watchObservedRunningTime="2025-10-07 13:14:51.995313567 +0000 UTC m=+844.156036244" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.023488 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" podStartSLOduration=5.276967668 podStartE2EDuration="17.023474959s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.915597216 +0000 UTC m=+830.076319893" lastFinishedPulling="2025-10-07 13:14:49.662104507 +0000 UTC m=+841.822827184" observedRunningTime="2025-10-07 13:14:52.019843324 +0000 UTC m=+844.180566001" watchObservedRunningTime="2025-10-07 13:14:52.023474959 +0000 UTC m=+844.184197636" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.040173 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" podStartSLOduration=5.334201328 podStartE2EDuration="17.04015682s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.93481309 +0000 UTC m=+830.095535767" lastFinishedPulling="2025-10-07 13:14:49.640768582 +0000 UTC m=+841.801491259" 
observedRunningTime="2025-10-07 13:14:52.037561575 +0000 UTC m=+844.198284252" watchObservedRunningTime="2025-10-07 13:14:52.04015682 +0000 UTC m=+844.200879497" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.073776 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" podStartSLOduration=4.920804812 podStartE2EDuration="17.073760408s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.510219942 +0000 UTC m=+829.670942619" lastFinishedPulling="2025-10-07 13:14:49.663175538 +0000 UTC m=+841.823898215" observedRunningTime="2025-10-07 13:14:52.067546059 +0000 UTC m=+844.228268736" watchObservedRunningTime="2025-10-07 13:14:52.073760408 +0000 UTC m=+844.234483075" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.089102 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w" podStartSLOduration=4.656337812 podStartE2EDuration="16.08908378s" podCreationTimestamp="2025-10-07 13:14:36 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.207460198 +0000 UTC m=+830.368182875" lastFinishedPulling="2025-10-07 13:14:49.640206166 +0000 UTC m=+841.800928843" observedRunningTime="2025-10-07 13:14:52.085228529 +0000 UTC m=+844.245951206" watchObservedRunningTime="2025-10-07 13:14:52.08908378 +0000 UTC m=+844.249806457" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.106840 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" podStartSLOduration=5.409103216 podStartE2EDuration="17.106825201s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.96430621 +0000 UTC m=+830.125028887" lastFinishedPulling="2025-10-07 13:14:49.662028195 +0000 UTC m=+841.822750872" 
observedRunningTime="2025-10-07 13:14:52.10122276 +0000 UTC m=+844.261945437" watchObservedRunningTime="2025-10-07 13:14:52.106825201 +0000 UTC m=+844.267547878" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.150720 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" podStartSLOduration=5.827601208 podStartE2EDuration="17.150703026s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.338827164 +0000 UTC m=+830.499549831" lastFinishedPulling="2025-10-07 13:14:49.661928972 +0000 UTC m=+841.822651649" observedRunningTime="2025-10-07 13:14:52.127118676 +0000 UTC m=+844.287841373" watchObservedRunningTime="2025-10-07 13:14:52.150703026 +0000 UTC m=+844.311425703" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.152459 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv" podStartSLOduration=5.752491013 podStartE2EDuration="17.152453176s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.242286532 +0000 UTC m=+830.403009209" lastFinishedPulling="2025-10-07 13:14:49.642248695 +0000 UTC m=+841.802971372" observedRunningTime="2025-10-07 13:14:52.149091429 +0000 UTC m=+844.309814106" watchObservedRunningTime="2025-10-07 13:14:52.152453176 +0000 UTC m=+844.313175853" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.176463 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" podStartSLOduration=5.744328368 podStartE2EDuration="17.176447348s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.229892305 +0000 UTC m=+830.390614982" lastFinishedPulling="2025-10-07 13:14:49.662011285 +0000 UTC m=+841.822733962" observedRunningTime="2025-10-07 
13:14:52.170724703 +0000 UTC m=+844.331447380" watchObservedRunningTime="2025-10-07 13:14:52.176447348 +0000 UTC m=+844.337170025" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.218058 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" podStartSLOduration=5.494171778 podStartE2EDuration="17.218044837s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.918023076 +0000 UTC m=+830.078745753" lastFinishedPulling="2025-10-07 13:14:49.641896125 +0000 UTC m=+841.802618812" observedRunningTime="2025-10-07 13:14:52.215693309 +0000 UTC m=+844.376415986" watchObservedRunningTime="2025-10-07 13:14:52.218044837 +0000 UTC m=+844.378767514" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.245961 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" podStartSLOduration=4.502377441 podStartE2EDuration="17.245943411s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:36.897205843 +0000 UTC m=+829.057928520" lastFinishedPulling="2025-10-07 13:14:49.640771813 +0000 UTC m=+841.801494490" observedRunningTime="2025-10-07 13:14:52.238518517 +0000 UTC m=+844.399241194" watchObservedRunningTime="2025-10-07 13:14:52.245943411 +0000 UTC m=+844.406666088" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.269555 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" podStartSLOduration=4.56514347 podStartE2EDuration="17.269539121s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:36.955919495 +0000 UTC m=+829.116642172" lastFinishedPulling="2025-10-07 13:14:49.660315146 +0000 UTC m=+841.821037823" observedRunningTime="2025-10-07 13:14:52.264406563 +0000 UTC 
m=+844.425129240" watchObservedRunningTime="2025-10-07 13:14:52.269539121 +0000 UTC m=+844.430261788" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.283953 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" podStartSLOduration=5.797920982 podStartE2EDuration="17.283940156s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.182110997 +0000 UTC m=+830.342833674" lastFinishedPulling="2025-10-07 13:14:49.668130171 +0000 UTC m=+841.828852848" observedRunningTime="2025-10-07 13:14:52.281350672 +0000 UTC m=+844.442073339" watchObservedRunningTime="2025-10-07 13:14:52.283940156 +0000 UTC m=+844.444662833" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.299377 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" podStartSLOduration=5.546453095 podStartE2EDuration="17.299364231s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:37.887422584 +0000 UTC m=+830.048145261" lastFinishedPulling="2025-10-07 13:14:49.64033372 +0000 UTC m=+841.801056397" observedRunningTime="2025-10-07 13:14:52.297159637 +0000 UTC m=+844.457882314" watchObservedRunningTime="2025-10-07 13:14:52.299364231 +0000 UTC m=+844.460086908" Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.982257 4959 generic.go:334] "Generic (PLEG): container finished" podID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerID="72b1bca13cd43e908f0a54217fb2844fa5e84b8b2c44aaf09b58213976e2e903" exitCode=0 Oct 07 13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.982611 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scq9z" event={"ID":"e612f4ce-1e8d-4937-8d89-8424cfe7b066","Type":"ContainerDied","Data":"72b1bca13cd43e908f0a54217fb2844fa5e84b8b2c44aaf09b58213976e2e903"} Oct 07 
13:14:52 crc kubenswrapper[4959]: I1007 13:14:52.988385 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerStarted","Data":"759f421227ac5555dc5835a9a89677e94fc274298c24c208241a2f79a2132ba9"} Oct 07 13:14:53 crc kubenswrapper[4959]: I1007 13:14:53.023136 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xdxcf" podStartSLOduration=6.819258387 podStartE2EDuration="18.02311982s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:41.142331998 +0000 UTC m=+833.303054675" lastFinishedPulling="2025-10-07 13:14:52.346193431 +0000 UTC m=+844.506916108" observedRunningTime="2025-10-07 13:14:53.015285755 +0000 UTC m=+845.176008452" watchObservedRunningTime="2025-10-07 13:14:53.02311982 +0000 UTC m=+845.183842487" Oct 07 13:14:53 crc kubenswrapper[4959]: I1007 13:14:53.998465 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scq9z" event={"ID":"e612f4ce-1e8d-4937-8d89-8424cfe7b066","Type":"ContainerStarted","Data":"32ce376a46b60140db97aed54f305a24920afe8002fd752877ce472d9d81cea7"} Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.000250 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" event={"ID":"e3213b12-9128-4d7c-8ec8-a731e6627de4","Type":"ContainerStarted","Data":"2039a091bb354056d8a52beae1465633cad6a6c3224c2bcea9d4942610896c46"} Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.000671 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.004078 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" event={"ID":"1f48e97d-5d4f-49d3-b550-d51242109806","Type":"ContainerStarted","Data":"0ce8a4acfd0fb5780644677657cfb40c2e262ae5d72ef66db5a6841229d9ee05"} Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.008658 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-nkfnv" Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.009101 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-t2wjn" Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.021475 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-scq9z" podStartSLOduration=7.024992372 podStartE2EDuration="9.021457095s" podCreationTimestamp="2025-10-07 13:14:45 +0000 UTC" firstStartedPulling="2025-10-07 13:14:51.854697834 +0000 UTC m=+844.015420511" lastFinishedPulling="2025-10-07 13:14:53.851162537 +0000 UTC m=+846.011885234" observedRunningTime="2025-10-07 13:14:54.016250145 +0000 UTC m=+846.176972832" watchObservedRunningTime="2025-10-07 13:14:54.021457095 +0000 UTC m=+846.182179772" Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 13:14:54.036884 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s" podStartSLOduration=3.668034973 podStartE2EDuration="19.036865729s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.350772219 +0000 UTC m=+830.511494896" lastFinishedPulling="2025-10-07 13:14:53.719602975 +0000 UTC m=+845.880325652" observedRunningTime="2025-10-07 13:14:54.033086901 +0000 UTC m=+846.193809578" watchObservedRunningTime="2025-10-07 13:14:54.036865729 +0000 UTC m=+846.197588406" Oct 07 13:14:54 crc kubenswrapper[4959]: I1007 
13:14:54.102077 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp" podStartSLOduration=2.756325878 podStartE2EDuration="18.102059889s" podCreationTimestamp="2025-10-07 13:14:36 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.368104038 +0000 UTC m=+830.528826715" lastFinishedPulling="2025-10-07 13:14:53.713838049 +0000 UTC m=+845.874560726" observedRunningTime="2025-10-07 13:14:54.0805911 +0000 UTC m=+846.241313807" watchObservedRunningTime="2025-10-07 13:14:54.102059889 +0000 UTC m=+846.262782556" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.055995 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-fd648f65-rmk5h" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.077978 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7ccfc8cf49-d4g6g" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.089926 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-scq9z" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.089976 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-scq9z" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.161221 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-scq9z" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.169355 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5b84cc7657-r57lc" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.293072 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-64f56ff694-b4rhk" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.297445 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-zl4v9" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.332324 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-6tcd6" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.336995 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-xmpkg" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.451236 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5467f8988c-6t98f" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.469494 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-sfhzx" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.519128 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xdxcf" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.519293 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xdxcf" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.545490 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-6kq7p" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.605652 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-26qz6" Oct 07 13:14:56 crc 
kubenswrapper[4959]: I1007 13:14:56.615008 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-54d485fd9-2vwpz" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.817541 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-sp68w" Oct 07 13:14:56 crc kubenswrapper[4959]: I1007 13:14:56.926012 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-rgw4d" Oct 07 13:14:57 crc kubenswrapper[4959]: I1007 13:14:57.560290 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xdxcf" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="registry-server" probeResult="failure" output=< Oct 07 13:14:57 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 13:14:57 crc kubenswrapper[4959]: > Oct 07 13:14:57 crc kubenswrapper[4959]: I1007 13:14:57.993561 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw" Oct 07 13:14:58 crc kubenswrapper[4959]: I1007 13:14:58.049936 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" event={"ID":"86969b11-9037-4890-93dc-575b83669d0f","Type":"ContainerStarted","Data":"bd22cdf1dac929776c513d6540880b8743217a9d85ffca94689c819db1f21f17"} Oct 07 13:14:58 crc kubenswrapper[4959]: I1007 13:14:58.050946 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" Oct 07 13:14:58 crc kubenswrapper[4959]: I1007 13:14:58.103128 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb" podStartSLOduration=4.699716108 podStartE2EDuration="23.103113859s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.368110088 +0000 UTC m=+830.528832765" lastFinishedPulling="2025-10-07 13:14:56.771507839 +0000 UTC m=+848.932230516" observedRunningTime="2025-10-07 13:14:58.097037584 +0000 UTC m=+850.257760261" watchObservedRunningTime="2025-10-07 13:14:58.103113859 +0000 UTC m=+850.263836536"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.057416 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" event={"ID":"4d7dd390-0ad9-42df-9a4f-c8804639fa3f","Type":"ContainerStarted","Data":"d3a5b7268c2a2307dc1738fa93a8f05f23fe47f3e013504c9957be3f0c64e9c0"}
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.057758 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.059073 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" event={"ID":"539702ff-226a-4c31-b715-af9af8ae1205","Type":"ContainerStarted","Data":"a1476feae9a4a536c9350288734582ddb8caa6588d6c15f05c78b226db86441a"}
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.059291 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.072281 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" podStartSLOduration=2.734758246 podStartE2EDuration="23.072268353s" podCreationTimestamp="2025-10-07 13:14:36 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.340794591 +0000 UTC m=+830.501517268" lastFinishedPulling="2025-10-07 13:14:58.678304698 +0000 UTC m=+850.839027375" observedRunningTime="2025-10-07 13:14:59.069853893 +0000 UTC m=+851.230576570" watchObservedRunningTime="2025-10-07 13:14:59.072268353 +0000 UTC m=+851.232991030"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.083826 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt" podStartSLOduration=3.74564276 podStartE2EDuration="24.083812806s" podCreationTimestamp="2025-10-07 13:14:35 +0000 UTC" firstStartedPulling="2025-10-07 13:14:38.345869887 +0000 UTC m=+830.506592564" lastFinishedPulling="2025-10-07 13:14:58.684039943 +0000 UTC m=+850.844762610" observedRunningTime="2025-10-07 13:14:59.083097455 +0000 UTC m=+851.243820132" watchObservedRunningTime="2025-10-07 13:14:59.083812806 +0000 UTC m=+851.244535483"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.216263 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lqlrs"]
Oct 07 13:14:59 crc kubenswrapper[4959]: E1007 13:14:59.216799 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="extract-utilities"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.216893 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="extract-utilities"
Oct 07 13:14:59 crc kubenswrapper[4959]: E1007 13:14:59.216966 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="registry-server"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.217025 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="registry-server"
Oct 07 13:14:59 crc kubenswrapper[4959]: E1007 13:14:59.217098 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="extract-content"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.217147 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="extract-content"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.217396 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" containerName="registry-server"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.228982 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.247066 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqlrs"]
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.362772 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-utilities\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.363372 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g22vn\" (UniqueName: \"kubernetes.io/projected/7445413c-7e73-465e-9eae-224ca4abed07-kube-api-access-g22vn\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.363726 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-catalog-content\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.465311 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-utilities\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.465604 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g22vn\" (UniqueName: \"kubernetes.io/projected/7445413c-7e73-465e-9eae-224ca4abed07-kube-api-access-g22vn\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.465657 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-catalog-content\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.465895 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-utilities\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.466109 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-catalog-content\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.494780 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g22vn\" (UniqueName: \"kubernetes.io/projected/7445413c-7e73-465e-9eae-224ca4abed07-kube-api-access-g22vn\") pod \"community-operators-lqlrs\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.553758 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:14:59 crc kubenswrapper[4959]: I1007 13:14:59.983497 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqlrs"]
Oct 07 13:14:59 crc kubenswrapper[4959]: W1007 13:14:59.988028 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7445413c_7e73_465e_9eae_224ca4abed07.slice/crio-7f48c35382b1aab2326b7f6cfce8ed600b09b40cf0594c4e736b29f3d29be139 WatchSource:0}: Error finding container 7f48c35382b1aab2326b7f6cfce8ed600b09b40cf0594c4e736b29f3d29be139: Status 404 returned error can't find the container with id 7f48c35382b1aab2326b7f6cfce8ed600b09b40cf0594c4e736b29f3d29be139
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.067109 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqlrs" event={"ID":"7445413c-7e73-465e-9eae-224ca4abed07","Type":"ContainerStarted","Data":"7f48c35382b1aab2326b7f6cfce8ed600b09b40cf0594c4e736b29f3d29be139"}
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.137272 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"]
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.138193 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.140702 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.141020 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.149500 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"]
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.277617 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa5d4db1-1025-48c0-850d-54ac32c93f1f-secret-volume\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.277759 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa5d4db1-1025-48c0-850d-54ac32c93f1f-config-volume\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.277841 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vvz6\" (UniqueName: \"kubernetes.io/projected/aa5d4db1-1025-48c0-850d-54ac32c93f1f-kube-api-access-8vvz6\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.379098 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vvz6\" (UniqueName: \"kubernetes.io/projected/aa5d4db1-1025-48c0-850d-54ac32c93f1f-kube-api-access-8vvz6\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.379170 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa5d4db1-1025-48c0-850d-54ac32c93f1f-secret-volume\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.379200 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa5d4db1-1025-48c0-850d-54ac32c93f1f-config-volume\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.380099 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa5d4db1-1025-48c0-850d-54ac32c93f1f-config-volume\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.384333 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa5d4db1-1025-48c0-850d-54ac32c93f1f-secret-volume\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.394826 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vvz6\" (UniqueName: \"kubernetes.io/projected/aa5d4db1-1025-48c0-850d-54ac32c93f1f-kube-api-access-8vvz6\") pod \"collect-profiles-29330715-8vtlk\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:00 crc kubenswrapper[4959]: I1007 13:15:00.463974 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:02 crc kubenswrapper[4959]: I1007 13:15:02.802537 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"]
Oct 07 13:15:03 crc kubenswrapper[4959]: I1007 13:15:03.091442 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk" event={"ID":"aa5d4db1-1025-48c0-850d-54ac32c93f1f","Type":"ContainerStarted","Data":"0517de98b838be148c457be9235110b9d13dfb945a17f1267256bb5b46b085ac"}
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.158934 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.200034 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-scq9z"]
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.398695 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b477879bc-nf6mt"
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.586131 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.625698 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.732903 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-76d5577b-k6btb"
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.737582 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-d772s"
Oct 07 13:15:06 crc kubenswrapper[4959]: I1007 13:15:06.858429 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:15:07 crc kubenswrapper[4959]: I1007 13:15:07.121928 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-scq9z" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="registry-server" containerID="cri-o://32ce376a46b60140db97aed54f305a24920afe8002fd752877ce472d9d81cea7" gracePeriod=2
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.130350 4959 generic.go:334] "Generic (PLEG): container finished" podID="7445413c-7e73-465e-9eae-224ca4abed07" containerID="0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487" exitCode=0
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.130477 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqlrs" event={"ID":"7445413c-7e73-465e-9eae-224ca4abed07","Type":"ContainerDied","Data":"0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487"}
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.134976 4959 generic.go:334] "Generic (PLEG): container finished" podID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerID="32ce376a46b60140db97aed54f305a24920afe8002fd752877ce472d9d81cea7" exitCode=0
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.135006 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scq9z" event={"ID":"e612f4ce-1e8d-4937-8d89-8424cfe7b066","Type":"ContainerDied","Data":"32ce376a46b60140db97aed54f305a24920afe8002fd752877ce472d9d81cea7"}
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.775387 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.800803 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zg56\" (UniqueName: \"kubernetes.io/projected/e612f4ce-1e8d-4937-8d89-8424cfe7b066-kube-api-access-7zg56\") pod \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") "
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.808374 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e612f4ce-1e8d-4937-8d89-8424cfe7b066-kube-api-access-7zg56" (OuterVolumeSpecName: "kube-api-access-7zg56") pod "e612f4ce-1e8d-4937-8d89-8424cfe7b066" (UID: "e612f4ce-1e8d-4937-8d89-8424cfe7b066"). InnerVolumeSpecName "kube-api-access-7zg56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.902845 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-catalog-content\") pod \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") "
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.903514 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-utilities\") pod \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\" (UID: \"e612f4ce-1e8d-4937-8d89-8424cfe7b066\") "
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.904677 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-utilities" (OuterVolumeSpecName: "utilities") pod "e612f4ce-1e8d-4937-8d89-8424cfe7b066" (UID: "e612f4ce-1e8d-4937-8d89-8424cfe7b066"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.904932 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.904962 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zg56\" (UniqueName: \"kubernetes.io/projected/e612f4ce-1e8d-4937-8d89-8424cfe7b066-kube-api-access-7zg56\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.925968 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e612f4ce-1e8d-4937-8d89-8424cfe7b066" (UID: "e612f4ce-1e8d-4937-8d89-8424cfe7b066"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.977064 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdxcf"]
Oct 07 13:15:08 crc kubenswrapper[4959]: I1007 13:15:08.977550 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xdxcf" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="registry-server" containerID="cri-o://759f421227ac5555dc5835a9a89677e94fc274298c24c208241a2f79a2132ba9" gracePeriod=2
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.006454 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e612f4ce-1e8d-4937-8d89-8424cfe7b066-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.151831 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scq9z" event={"ID":"e612f4ce-1e8d-4937-8d89-8424cfe7b066","Type":"ContainerDied","Data":"85058f8b0b13a9b3ef506234739efa962925575699c48c9b3dbe7a2ba2eeed1a"}
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.151890 4959 scope.go:117] "RemoveContainer" containerID="32ce376a46b60140db97aed54f305a24920afe8002fd752877ce472d9d81cea7"
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.151889 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scq9z"
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.154150 4959 generic.go:334] "Generic (PLEG): container finished" podID="aa5d4db1-1025-48c0-850d-54ac32c93f1f" containerID="04ba059da8bd75f5ff275d86fe67b2a955bbee0c57b2a3a920151b8480efc867" exitCode=0
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.154180 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk" event={"ID":"aa5d4db1-1025-48c0-850d-54ac32c93f1f","Type":"ContainerDied","Data":"04ba059da8bd75f5ff275d86fe67b2a955bbee0c57b2a3a920151b8480efc867"}
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.180823 4959 scope.go:117] "RemoveContainer" containerID="72b1bca13cd43e908f0a54217fb2844fa5e84b8b2c44aaf09b58213976e2e903"
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.205474 4959 scope.go:117] "RemoveContainer" containerID="a2e76c5193151e1c072d53649a037e84ec917965d706c7d27e06f0737979b6b5"
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.217991 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-scq9z"]
Oct 07 13:15:09 crc kubenswrapper[4959]: I1007 13:15:09.227163 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-scq9z"]
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.173852 4959 generic.go:334] "Generic (PLEG): container finished" podID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerID="759f421227ac5555dc5835a9a89677e94fc274298c24c208241a2f79a2132ba9" exitCode=0
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.173904 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerDied","Data":"759f421227ac5555dc5835a9a89677e94fc274298c24c208241a2f79a2132ba9"}
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.278084 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.329549 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-catalog-content\") pod \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") "
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.329761 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7hr7\" (UniqueName: \"kubernetes.io/projected/72de1e9e-2526-42a4-bbd6-fc89237e75a7-kube-api-access-c7hr7\") pod \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") "
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.329820 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-utilities\") pod \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\" (UID: \"72de1e9e-2526-42a4-bbd6-fc89237e75a7\") "
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.331757 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-utilities" (OuterVolumeSpecName: "utilities") pod "72de1e9e-2526-42a4-bbd6-fc89237e75a7" (UID: "72de1e9e-2526-42a4-bbd6-fc89237e75a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.336140 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72de1e9e-2526-42a4-bbd6-fc89237e75a7-kube-api-access-c7hr7" (OuterVolumeSpecName: "kube-api-access-c7hr7") pod "72de1e9e-2526-42a4-bbd6-fc89237e75a7" (UID: "72de1e9e-2526-42a4-bbd6-fc89237e75a7"). InnerVolumeSpecName "kube-api-access-c7hr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.427388 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72de1e9e-2526-42a4-bbd6-fc89237e75a7" (UID: "72de1e9e-2526-42a4-bbd6-fc89237e75a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.432173 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7hr7\" (UniqueName: \"kubernetes.io/projected/72de1e9e-2526-42a4-bbd6-fc89237e75a7-kube-api-access-c7hr7\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.432249 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.432270 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72de1e9e-2526-42a4-bbd6-fc89237e75a7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.458378 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.533582 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa5d4db1-1025-48c0-850d-54ac32c93f1f-secret-volume\") pod \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") "
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.533617 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vvz6\" (UniqueName: \"kubernetes.io/projected/aa5d4db1-1025-48c0-850d-54ac32c93f1f-kube-api-access-8vvz6\") pod \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") "
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.533697 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa5d4db1-1025-48c0-850d-54ac32c93f1f-config-volume\") pod \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\" (UID: \"aa5d4db1-1025-48c0-850d-54ac32c93f1f\") "
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.535010 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5d4db1-1025-48c0-850d-54ac32c93f1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "aa5d4db1-1025-48c0-850d-54ac32c93f1f" (UID: "aa5d4db1-1025-48c0-850d-54ac32c93f1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.538249 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5d4db1-1025-48c0-850d-54ac32c93f1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aa5d4db1-1025-48c0-850d-54ac32c93f1f" (UID: "aa5d4db1-1025-48c0-850d-54ac32c93f1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.540070 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5d4db1-1025-48c0-850d-54ac32c93f1f-kube-api-access-8vvz6" (OuterVolumeSpecName: "kube-api-access-8vvz6") pod "aa5d4db1-1025-48c0-850d-54ac32c93f1f" (UID: "aa5d4db1-1025-48c0-850d-54ac32c93f1f"). InnerVolumeSpecName "kube-api-access-8vvz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.634700 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa5d4db1-1025-48c0-850d-54ac32c93f1f-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.634735 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vvz6\" (UniqueName: \"kubernetes.io/projected/aa5d4db1-1025-48c0-850d-54ac32c93f1f-kube-api-access-8vvz6\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.634744 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa5d4db1-1025-48c0-850d-54ac32c93f1f-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 13:15:10 crc kubenswrapper[4959]: I1007 13:15:10.819125 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" path="/var/lib/kubelet/pods/e612f4ce-1e8d-4937-8d89-8424cfe7b066/volumes"
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.189254 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxcf" event={"ID":"72de1e9e-2526-42a4-bbd6-fc89237e75a7","Type":"ContainerDied","Data":"8c3ac641394a373a5e09d1b344bbbc6ad3bf87e43cd2ed0457960f75fe25155b"}
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.189344 4959 scope.go:117] "RemoveContainer" containerID="759f421227ac5555dc5835a9a89677e94fc274298c24c208241a2f79a2132ba9"
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.189505 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxcf"
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.195821 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk" event={"ID":"aa5d4db1-1025-48c0-850d-54ac32c93f1f","Type":"ContainerDied","Data":"0517de98b838be148c457be9235110b9d13dfb945a17f1267256bb5b46b085ac"}
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.195935 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.195967 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0517de98b838be148c457be9235110b9d13dfb945a17f1267256bb5b46b085ac"
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.203162 4959 generic.go:334] "Generic (PLEG): container finished" podID="7445413c-7e73-465e-9eae-224ca4abed07" containerID="f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe" exitCode=0
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.203242 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqlrs" event={"ID":"7445413c-7e73-465e-9eae-224ca4abed07","Type":"ContainerDied","Data":"f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe"}
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.218159 4959 scope.go:117] "RemoveContainer" containerID="e63945291aa27df10571b1e69fe74c94ae383a28921efeea4a2fefc7d400c590"
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.225259 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdxcf"]
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.239621 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xdxcf"]
Oct 07 13:15:11 crc kubenswrapper[4959]: I1007 13:15:11.269798 4959 scope.go:117] "RemoveContainer" containerID="a8ae1839419c535402300ddf83cfc3f2557c3c46fbdf201020b79f3f64ef97d8"
Oct 07 13:15:12 crc kubenswrapper[4959]: I1007 13:15:12.210594 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqlrs" event={"ID":"7445413c-7e73-465e-9eae-224ca4abed07","Type":"ContainerStarted","Data":"c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c"}
Oct 07 13:15:12 crc kubenswrapper[4959]: I1007 13:15:12.234059 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lqlrs" podStartSLOduration=9.528832995 podStartE2EDuration="13.234040929s" podCreationTimestamp="2025-10-07 13:14:59 +0000 UTC" firstStartedPulling="2025-10-07 13:15:08.134989513 +0000 UTC m=+860.295712190" lastFinishedPulling="2025-10-07 13:15:11.840197447 +0000 UTC m=+864.000920124" observedRunningTime="2025-10-07 13:15:12.232221176 +0000 UTC m=+864.392943903" watchObservedRunningTime="2025-10-07 13:15:12.234040929 +0000 UTC m=+864.394763626"
Oct 07 13:15:12 crc kubenswrapper[4959]: I1007 13:15:12.819907 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" path="/var/lib/kubelet/pods/72de1e9e-2526-42a4-bbd6-fc89237e75a7/volumes"
Oct 07 13:15:19 crc kubenswrapper[4959]: I1007 13:15:19.554801 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:15:19 crc kubenswrapper[4959]: I1007 13:15:19.556029 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:15:19 crc kubenswrapper[4959]: I1007 13:15:19.595947 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:15:20 crc kubenswrapper[4959]: I1007 13:15:20.326973 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lqlrs"
Oct 07 13:15:20 crc kubenswrapper[4959]: I1007 13:15:20.378527 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqlrs"]
Oct 07 13:15:21 crc kubenswrapper[4959]: I1007 13:15:21.177994 4959 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","poda179b1bb-17fb-4ba8-80f9-0741c9b49c04"] err="unable to destroy cgroup paths for cgroup [kubepods burstable poda179b1bb-17fb-4ba8-80f9-0741c9b49c04] : Timed out while waiting for systemd to remove kubepods-burstable-poda179b1bb_17fb_4ba8_80f9_0741c9b49c04.slice"
Oct 07 13:15:21 crc kubenswrapper[4959]: E1007 13:15:21.179265 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable poda179b1bb-17fb-4ba8-80f9-0741c9b49c04] : unable to destroy cgroup paths for cgroup [kubepods burstable poda179b1bb-17fb-4ba8-80f9-0741c9b49c04] : Timed out while waiting for systemd to remove kubepods-burstable-poda179b1bb_17fb_4ba8_80f9_0741c9b49c04.slice" pod="openshift-marketplace/certified-operators-vgd79" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04"
Oct 07 13:15:21 crc kubenswrapper[4959]: I1007 13:15:21.280453 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgd79"
Oct 07 13:15:21 crc kubenswrapper[4959]: I1007 13:15:21.302195 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgd79"]
Oct 07 13:15:21 crc kubenswrapper[4959]: I1007 13:15:21.308394 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vgd79"]
Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.286313 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lqlrs" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="registry-server" containerID="cri-o://c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c" gracePeriod=2
Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.716300 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-4qr99"]
Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721021 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="registry-server"
Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721055 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="registry-server"
Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721075 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="registry-server"
Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721083 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="registry-server"
Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721092 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d4db1-1025-48c0-850d-54ac32c93f1f" containerName="collect-profiles"
Oct 07 13:15:22 crc kubenswrapper[4959]:
I1007 13:15:22.721101 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d4db1-1025-48c0-850d-54ac32c93f1f" containerName="collect-profiles" Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721127 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="extract-content" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721135 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="extract-content" Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721153 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="extract-content" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721160 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="extract-content" Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721175 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="extract-utilities" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721186 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="extract-utilities" Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.721194 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="extract-utilities" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721200 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="extract-utilities" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721397 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e612f4ce-1e8d-4937-8d89-8424cfe7b066" containerName="registry-server" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 
13:15:22.721426 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="72de1e9e-2526-42a4-bbd6-fc89237e75a7" containerName="registry-server" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.721438 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5d4db1-1025-48c0-850d-54ac32c93f1f" containerName="collect-profiles" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.722405 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.724150 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.724553 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wdd9r" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.724809 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-4qr99"] Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.726015 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.726861 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.767693 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqlrs" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.776201 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-wp7jq"] Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.776577 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="registry-server" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.776601 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="registry-server" Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.776612 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="extract-utilities" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.776620 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="extract-utilities" Oct 07 13:15:22 crc kubenswrapper[4959]: E1007 13:15:22.776678 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="extract-content" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.776687 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="extract-content" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.776845 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="7445413c-7e73-465e-9eae-224ca4abed07" containerName="registry-server" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.777745 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.781043 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.793781 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-wp7jq"] Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.805363 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-catalog-content\") pod \"7445413c-7e73-465e-9eae-224ca4abed07\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807048 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g22vn\" (UniqueName: \"kubernetes.io/projected/7445413c-7e73-465e-9eae-224ca4abed07-kube-api-access-g22vn\") pod \"7445413c-7e73-465e-9eae-224ca4abed07\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807103 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-utilities\") pod \"7445413c-7e73-465e-9eae-224ca4abed07\" (UID: \"7445413c-7e73-465e-9eae-224ca4abed07\") " Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807371 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807398 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-kube-api-access-4qsj6\") pod \"dnsmasq-dns-7bfcb9d745-4qr99\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807426 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-config\") pod \"dnsmasq-dns-7bfcb9d745-4qr99\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807481 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6sjb\" (UniqueName: \"kubernetes.io/projected/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-kube-api-access-k6sjb\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.807513 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-config\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.810393 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-utilities" (OuterVolumeSpecName: "utilities") pod "7445413c-7e73-465e-9eae-224ca4abed07" (UID: "7445413c-7e73-465e-9eae-224ca4abed07"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.827092 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a179b1bb-17fb-4ba8-80f9-0741c9b49c04" path="/var/lib/kubelet/pods/a179b1bb-17fb-4ba8-80f9-0741c9b49c04/volumes" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.836574 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7445413c-7e73-465e-9eae-224ca4abed07-kube-api-access-g22vn" (OuterVolumeSpecName: "kube-api-access-g22vn") pod "7445413c-7e73-465e-9eae-224ca4abed07" (UID: "7445413c-7e73-465e-9eae-224ca4abed07"). InnerVolumeSpecName "kube-api-access-g22vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.872767 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7445413c-7e73-465e-9eae-224ca4abed07" (UID: "7445413c-7e73-465e-9eae-224ca4abed07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.907946 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6sjb\" (UniqueName: \"kubernetes.io/projected/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-kube-api-access-k6sjb\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.907991 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-config\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.908036 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.908054 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-kube-api-access-4qsj6\") pod \"dnsmasq-dns-7bfcb9d745-4qr99\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.908077 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-config\") pod \"dnsmasq-dns-7bfcb9d745-4qr99\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 
crc kubenswrapper[4959]: I1007 13:15:22.908127 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.908137 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g22vn\" (UniqueName: \"kubernetes.io/projected/7445413c-7e73-465e-9eae-224ca4abed07-kube-api-access-g22vn\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.908146 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7445413c-7e73-465e-9eae-224ca4abed07-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.909070 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-config\") pod \"dnsmasq-dns-7bfcb9d745-4qr99\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.909072 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-config\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.909152 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.932370 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-kube-api-access-4qsj6\") pod \"dnsmasq-dns-7bfcb9d745-4qr99\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:22 crc kubenswrapper[4959]: I1007 13:15:22.932386 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6sjb\" (UniqueName: \"kubernetes.io/projected/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-kube-api-access-k6sjb\") pod \"dnsmasq-dns-758b79db4c-wp7jq\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.088771 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.101306 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.296858 4959 generic.go:334] "Generic (PLEG): container finished" podID="7445413c-7e73-465e-9eae-224ca4abed07" containerID="c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c" exitCode=0 Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.296900 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqlrs" event={"ID":"7445413c-7e73-465e-9eae-224ca4abed07","Type":"ContainerDied","Data":"c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c"} Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.296931 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqlrs" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.296957 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqlrs" event={"ID":"7445413c-7e73-465e-9eae-224ca4abed07","Type":"ContainerDied","Data":"7f48c35382b1aab2326b7f6cfce8ed600b09b40cf0594c4e736b29f3d29be139"} Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.296977 4959 scope.go:117] "RemoveContainer" containerID="c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.311189 4959 scope.go:117] "RemoveContainer" containerID="f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.331006 4959 scope.go:117] "RemoveContainer" containerID="0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.334896 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqlrs"] Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.339668 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lqlrs"] Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.365824 4959 scope.go:117] "RemoveContainer" containerID="c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c" Oct 07 13:15:23 crc kubenswrapper[4959]: E1007 13:15:23.377012 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c\": container with ID starting with c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c not found: ID does not exist" containerID="c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.377051 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c"} err="failed to get container status \"c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c\": rpc error: code = NotFound desc = could not find container \"c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c\": container with ID starting with c452316c729f5778ebf84417d47a1817c285ac5326ebf04ef22dc1080a9abf1c not found: ID does not exist" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.377075 4959 scope.go:117] "RemoveContainer" containerID="f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe" Oct 07 13:15:23 crc kubenswrapper[4959]: E1007 13:15:23.377427 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe\": container with ID starting with f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe not found: ID does not exist" containerID="f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.377455 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe"} err="failed to get container status \"f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe\": rpc error: code = NotFound desc = could not find container \"f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe\": container with ID starting with f0433e13532935a17047577cf1c191635db6d774b30647cc084236e3f30f2abe not found: ID does not exist" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.377474 4959 scope.go:117] "RemoveContainer" containerID="0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487" Oct 07 13:15:23 crc kubenswrapper[4959]: E1007 
13:15:23.377895 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487\": container with ID starting with 0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487 not found: ID does not exist" containerID="0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.377926 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487"} err="failed to get container status \"0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487\": rpc error: code = NotFound desc = could not find container \"0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487\": container with ID starting with 0a002927234bbdf1cf79b47cb4377c471ecf8da4942166b635264c643a131487 not found: ID does not exist" Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.504960 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-4qr99"] Oct 07 13:15:23 crc kubenswrapper[4959]: I1007 13:15:23.571234 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-wp7jq"] Oct 07 13:15:24 crc kubenswrapper[4959]: I1007 13:15:24.305419 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" event={"ID":"b5a26cf8-2e0e-4e47-9fa5-66407d21539a","Type":"ContainerStarted","Data":"f44fac577fba2af604a250bc8ff593991d61d80c570ce91663924638d4b8dd20"} Oct 07 13:15:24 crc kubenswrapper[4959]: I1007 13:15:24.308197 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" event={"ID":"c0866e59-1ee2-4b76-a976-e8e1a7788bf4","Type":"ContainerStarted","Data":"75542f9d2395b9ecb2f76e5a93de8a4222cf71175880b62e87517b9e3f7ef286"} Oct 07 13:15:24 crc 
kubenswrapper[4959]: I1007 13:15:24.820531 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7445413c-7e73-465e-9eae-224ca4abed07" path="/var/lib/kubelet/pods/7445413c-7e73-465e-9eae-224ca4abed07/volumes" Oct 07 13:15:25 crc kubenswrapper[4959]: I1007 13:15:25.924933 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-wp7jq"] Oct 07 13:15:25 crc kubenswrapper[4959]: I1007 13:15:25.952278 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8575fc99d7-xc6jp"] Oct 07 13:15:25 crc kubenswrapper[4959]: I1007 13:15:25.953686 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:25 crc kubenswrapper[4959]: I1007 13:15:25.979119 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8575fc99d7-xc6jp"] Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.082830 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmbw\" (UniqueName: \"kubernetes.io/projected/fe087602-e770-412d-9795-9dc51a94f267-kube-api-access-2hmbw\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.082906 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-config\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.083003 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-dns-svc\") pod 
\"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.183744 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-dns-svc\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.183802 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmbw\" (UniqueName: \"kubernetes.io/projected/fe087602-e770-412d-9795-9dc51a94f267-kube-api-access-2hmbw\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.183830 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-config\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.184662 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-config\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.184881 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-dns-svc\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " 
pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.214602 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmbw\" (UniqueName: \"kubernetes.io/projected/fe087602-e770-412d-9795-9dc51a94f267-kube-api-access-2hmbw\") pod \"dnsmasq-dns-8575fc99d7-xc6jp\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.275912 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.298471 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-4qr99"] Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.324807 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77597f887-h8vzm"] Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.326313 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.334366 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-h8vzm"] Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.387706 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqfqv\" (UniqueName: \"kubernetes.io/projected/db523e83-ee5e-42fd-acb8-4d22edd64e3d-kube-api-access-nqfqv\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.387755 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-config\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.387782 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-dns-svc\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.489058 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqfqv\" (UniqueName: \"kubernetes.io/projected/db523e83-ee5e-42fd-acb8-4d22edd64e3d-kube-api-access-nqfqv\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.489114 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-config\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.489136 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-dns-svc\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.490575 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-config\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.494800 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-dns-svc\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.524971 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqfqv\" (UniqueName: \"kubernetes.io/projected/db523e83-ee5e-42fd-acb8-4d22edd64e3d-kube-api-access-nqfqv\") pod \"dnsmasq-dns-77597f887-h8vzm\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:26 crc kubenswrapper[4959]: I1007 13:15:26.653136 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.153387 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.156224 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159005 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159129 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159308 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159420 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159613 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fc4nq" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159750 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.159869 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.179012 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.301575 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.301644 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.301715 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.301743 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.301838 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.301965 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.302024 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-kube-api-access-r8xqs\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.302060 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.302126 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.302191 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.302239 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404126 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404190 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404216 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404266 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404284 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404312 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404331 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404347 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404379 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404406 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-kube-api-access-r8xqs\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404427 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.404523 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.405039 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.405086 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.406640 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 
13:15:27.406890 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.409997 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.410739 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.411013 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.412334 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.412939 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.422460 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-kube-api-access-r8xqs\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.445301 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.452273 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.453669 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.461153 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.461350 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.461411 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.461521 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.461715 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.461993 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d9h7w" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.462831 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.481418 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.483299 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609497 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609546 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609582 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609612 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-config-data\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609642 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8703d817-5027-4394-a52d-a895f7e0fd10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609679 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609704 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609721 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzjr8\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-kube-api-access-vzjr8\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609829 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609882 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.609990 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8703d817-5027-4394-a52d-a895f7e0fd10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.711574 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.711637 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.711655 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzjr8\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-kube-api-access-vzjr8\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.711686 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: 
I1007 13:15:27.712539 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.711825 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.712486 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715035 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8703d817-5027-4394-a52d-a895f7e0fd10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715068 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715088 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715121 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715150 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-config-data\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715165 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8703d817-5027-4394-a52d-a895f7e0fd10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.715862 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.716308 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " 
pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.716826 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-config-data\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.716982 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.717162 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.718426 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.719811 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8703d817-5027-4394-a52d-a895f7e0fd10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.729238 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vzjr8\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-kube-api-access-vzjr8\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.729700 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8703d817-5027-4394-a52d-a895f7e0fd10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.731407 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " pod="openstack/rabbitmq-server-0" Oct 07 13:15:27 crc kubenswrapper[4959]: I1007 13:15:27.807937 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.869335 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.870994 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.873217 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.873493 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.873775 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mczlh" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.878206 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.878346 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.884259 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 13:15:29 crc kubenswrapper[4959]: I1007 13:15:29.900344 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.066665 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e980567-4b6d-474f-ae89-3dc436ebf1a5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067108 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " 
pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067204 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067297 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-secrets\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067353 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-config-data-default\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067424 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067489 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 
13:15:30.067565 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7q6z\" (UniqueName: \"kubernetes.io/projected/5e980567-4b6d-474f-ae89-3dc436ebf1a5-kube-api-access-q7q6z\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.067680 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-kolla-config\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.169288 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-kolla-config\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.169368 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e980567-4b6d-474f-ae89-3dc436ebf1a5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.169445 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.169474 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.169726 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e980567-4b6d-474f-ae89-3dc436ebf1a5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.170364 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-kolla-config\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.171104 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.171266 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-secrets\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.171665 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.171726 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.171763 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.171804 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7q6z\" (UniqueName: \"kubernetes.io/projected/5e980567-4b6d-474f-ae89-3dc436ebf1a5-kube-api-access-q7q6z\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.172017 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e980567-4b6d-474f-ae89-3dc436ebf1a5-config-data-default\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.172247 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.179357 
4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-secrets\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.180969 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.186854 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e980567-4b6d-474f-ae89-3dc436ebf1a5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.197612 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7q6z\" (UniqueName: \"kubernetes.io/projected/5e980567-4b6d-474f-ae89-3dc436ebf1a5-kube-api-access-q7q6z\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.201403 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5e980567-4b6d-474f-ae89-3dc436ebf1a5\") " pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.284294 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.285759 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.289204 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.290438 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.290649 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-x4wpx" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.290892 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.299413 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475141 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475349 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fed91ea6-e906-47c4-84e0-123c01a9780d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475537 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475590 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475651 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bchk\" (UniqueName: \"kubernetes.io/projected/fed91ea6-e906-47c4-84e0-123c01a9780d-kube-api-access-6bchk\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475699 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475773 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475855 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.475907 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.499062 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577566 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577607 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577693 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bchk\" (UniqueName: \"kubernetes.io/projected/fed91ea6-e906-47c4-84e0-123c01a9780d-kube-api-access-6bchk\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 
13:15:30.577722 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577756 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577780 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577809 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577829 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.577854 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/fed91ea6-e906-47c4-84e0-123c01a9780d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.578317 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fed91ea6-e906-47c4-84e0-123c01a9780d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.579088 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.581189 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.581941 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.582070 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.583654 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.584081 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed91ea6-e906-47c4-84e0-123c01a9780d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.585415 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fed91ea6-e906-47c4-84e0-123c01a9780d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.599902 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.614742 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bchk\" (UniqueName: \"kubernetes.io/projected/fed91ea6-e906-47c4-84e0-123c01a9780d-kube-api-access-6bchk\") pod \"openstack-cell1-galera-0\" (UID: 
\"fed91ea6-e906-47c4-84e0-123c01a9780d\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.624380 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.640818 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.642411 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.648968 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.649215 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.649844 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qjtjx" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.653478 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.780442 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgnts\" (UniqueName: \"kubernetes.io/projected/72f6396e-c1ff-485b-8878-33f9ab5dc874-kube-api-access-dgnts\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.780504 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f6396e-c1ff-485b-8878-33f9ab5dc874-kolla-config\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 
13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.780580 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72f6396e-c1ff-485b-8878-33f9ab5dc874-config-data\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.780599 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f6396e-c1ff-485b-8878-33f9ab5dc874-memcached-tls-certs\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.780614 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f6396e-c1ff-485b-8878-33f9ab5dc874-combined-ca-bundle\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.882444 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72f6396e-c1ff-485b-8878-33f9ab5dc874-config-data\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.882496 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f6396e-c1ff-485b-8878-33f9ab5dc874-memcached-tls-certs\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.882518 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/72f6396e-c1ff-485b-8878-33f9ab5dc874-combined-ca-bundle\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.882559 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgnts\" (UniqueName: \"kubernetes.io/projected/72f6396e-c1ff-485b-8878-33f9ab5dc874-kube-api-access-dgnts\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.882590 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f6396e-c1ff-485b-8878-33f9ab5dc874-kolla-config\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.883669 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72f6396e-c1ff-485b-8878-33f9ab5dc874-config-data\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.883735 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f6396e-c1ff-485b-8878-33f9ab5dc874-kolla-config\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.888648 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f6396e-c1ff-485b-8878-33f9ab5dc874-combined-ca-bundle\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 
13:15:30.889567 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f6396e-c1ff-485b-8878-33f9ab5dc874-memcached-tls-certs\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.902399 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgnts\" (UniqueName: \"kubernetes.io/projected/72f6396e-c1ff-485b-8878-33f9ab5dc874-kube-api-access-dgnts\") pod \"memcached-0\" (UID: \"72f6396e-c1ff-485b-8878-33f9ab5dc874\") " pod="openstack/memcached-0" Oct 07 13:15:30 crc kubenswrapper[4959]: I1007 13:15:30.979939 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.459572 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.461884 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.465682 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bg76x" Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.468309 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.534569 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lvl\" (UniqueName: \"kubernetes.io/projected/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61-kube-api-access-w5lvl\") pod \"kube-state-metrics-0\" (UID: \"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61\") " pod="openstack/kube-state-metrics-0" Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.636649 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lvl\" (UniqueName: \"kubernetes.io/projected/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61-kube-api-access-w5lvl\") pod \"kube-state-metrics-0\" (UID: \"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61\") " pod="openstack/kube-state-metrics-0" Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.671750 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lvl\" (UniqueName: \"kubernetes.io/projected/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61-kube-api-access-w5lvl\") pod \"kube-state-metrics-0\" (UID: \"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61\") " pod="openstack/kube-state-metrics-0" Oct 07 13:15:32 crc kubenswrapper[4959]: I1007 13:15:32.779069 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.061239 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-z8f9v"]
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.064343 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.066606 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mkhq5"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.069353 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.069422 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.080315 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z8f9v"]
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.121908 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4nl8g"]
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.124571 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.140042 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4nl8g"]
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207488 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/907772e5-2f0c-4478-9d3b-8f82eec8f258-scripts\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207568 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-log-ovn\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207604 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-run-ovn\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207646 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/907772e5-2f0c-4478-9d3b-8f82eec8f258-ovn-controller-tls-certs\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207677 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-run\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207703 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcq7h\" (UniqueName: \"kubernetes.io/projected/907772e5-2f0c-4478-9d3b-8f82eec8f258-kube-api-access-qcq7h\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.207742 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907772e5-2f0c-4478-9d3b-8f82eec8f258-combined-ca-bundle\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309245 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907772e5-2f0c-4478-9d3b-8f82eec8f258-combined-ca-bundle\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309300 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-log\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309326 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/907772e5-2f0c-4478-9d3b-8f82eec8f258-scripts\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309346 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-lib\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309389 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba866f69-2f83-4b66-b1af-693f07c437e0-scripts\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309414 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-log-ovn\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309430 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-run\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309458 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9t6\" (UniqueName: \"kubernetes.io/projected/ba866f69-2f83-4b66-b1af-693f07c437e0-kube-api-access-mh9t6\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309484 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-run-ovn\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309505 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/907772e5-2f0c-4478-9d3b-8f82eec8f258-ovn-controller-tls-certs\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309523 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-run\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309538 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-etc-ovs\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309564 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcq7h\" (UniqueName: \"kubernetes.io/projected/907772e5-2f0c-4478-9d3b-8f82eec8f258-kube-api-access-qcq7h\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.309963 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-log-ovn\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.310198 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-run-ovn\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.310246 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/907772e5-2f0c-4478-9d3b-8f82eec8f258-var-run\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.311327 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/907772e5-2f0c-4478-9d3b-8f82eec8f258-scripts\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.316940 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907772e5-2f0c-4478-9d3b-8f82eec8f258-combined-ca-bundle\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.316963 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/907772e5-2f0c-4478-9d3b-8f82eec8f258-ovn-controller-tls-certs\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.330959 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcq7h\" (UniqueName: \"kubernetes.io/projected/907772e5-2f0c-4478-9d3b-8f82eec8f258-kube-api-access-qcq7h\") pod \"ovn-controller-z8f9v\" (UID: \"907772e5-2f0c-4478-9d3b-8f82eec8f258\") " pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.396234 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z8f9v"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411376 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-log\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411434 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-lib\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411496 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba866f69-2f83-4b66-b1af-693f07c437e0-scripts\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411533 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-run\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411570 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9t6\" (UniqueName: \"kubernetes.io/projected/ba866f69-2f83-4b66-b1af-693f07c437e0-kube-api-access-mh9t6\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411613 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-etc-ovs\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411898 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-log\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.411959 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-etc-ovs\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.412109 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-lib\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.412170 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba866f69-2f83-4b66-b1af-693f07c437e0-var-run\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.413801 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba866f69-2f83-4b66-b1af-693f07c437e0-scripts\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.433585 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9t6\" (UniqueName: \"kubernetes.io/projected/ba866f69-2f83-4b66-b1af-693f07c437e0-kube-api-access-mh9t6\") pod \"ovn-controller-ovs-4nl8g\" (UID: \"ba866f69-2f83-4b66-b1af-693f07c437e0\") " pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.444554 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4nl8g"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.973002 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.974935 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.978742 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.980091 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.980399 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.980657 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bvwx8"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.980953 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Oct 07 13:15:36 crc kubenswrapper[4959]: I1007 13:15:36.992442 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142120 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142231 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151d32f4-496d-43a0-aeb7-ee999d5faeef-config\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142262 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142302 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/151d32f4-496d-43a0-aeb7-ee999d5faeef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142328 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142360 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcf4k\" (UniqueName: \"kubernetes.io/projected/151d32f4-496d-43a0-aeb7-ee999d5faeef-kube-api-access-vcf4k\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142517 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/151d32f4-496d-43a0-aeb7-ee999d5faeef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.142639 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.243823 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151d32f4-496d-43a0-aeb7-ee999d5faeef-config\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.243893 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.243946 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/151d32f4-496d-43a0-aeb7-ee999d5faeef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.243980 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.244019 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcf4k\" (UniqueName: \"kubernetes.io/projected/151d32f4-496d-43a0-aeb7-ee999d5faeef-kube-api-access-vcf4k\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.244065 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/151d32f4-496d-43a0-aeb7-ee999d5faeef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.244102 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.244143 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.245686 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.245728 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151d32f4-496d-43a0-aeb7-ee999d5faeef-config\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.247001 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/151d32f4-496d-43a0-aeb7-ee999d5faeef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.247727 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/151d32f4-496d-43a0-aeb7-ee999d5faeef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.249667 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.250006 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.252619 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151d32f4-496d-43a0-aeb7-ee999d5faeef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.265441 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcf4k\" (UniqueName: \"kubernetes.io/projected/151d32f4-496d-43a0-aeb7-ee999d5faeef-kube-api-access-vcf4k\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.267743 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"151d32f4-496d-43a0-aeb7-ee999d5faeef\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.349664 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.934200 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 07 13:15:37 crc kubenswrapper[4959]: I1007 13:15:37.968545 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-h8vzm"]
Oct 07 13:15:38 crc kubenswrapper[4959]: W1007 13:15:38.426018 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8703d817_5027_4394_a52d_a895f7e0fd10.slice/crio-b9b277bd9b1c3beea429c40e125fcf1770c814f29e6c219aa891fbc2161905d5 WatchSource:0}: Error finding container b9b277bd9b1c3beea429c40e125fcf1770c814f29e6c219aa891fbc2161905d5: Status 404 returned error can't find the container with id b9b277bd9b1c3beea429c40e125fcf1770c814f29e6c219aa891fbc2161905d5
Oct 07 13:15:38 crc kubenswrapper[4959]: E1007 13:15:38.426194 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df"
Oct 07 13:15:38 crc kubenswrapper[4959]: E1007 13:15:38.426353 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6sjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-758b79db4c-wp7jq_openstack(c0866e59-1ee2-4b76-a976-e8e1a7788bf4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 13:15:38 crc kubenswrapper[4959]: E1007 13:15:38.427436 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" podUID="c0866e59-1ee2-4b76-a976-e8e1a7788bf4"
Oct 07 13:15:38 crc kubenswrapper[4959]: W1007 13:15:38.427603 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb523e83_ee5e_42fd_acb8_4d22edd64e3d.slice/crio-2509851f29d37a2b98bf0a1e2d63d8df194d0cd043dfdc0d0e0bfe56407fa9be WatchSource:0}: Error finding container 2509851f29d37a2b98bf0a1e2d63d8df194d0cd043dfdc0d0e0bfe56407fa9be: Status 404 returned error can't find the container with id 2509851f29d37a2b98bf0a1e2d63d8df194d0cd043dfdc0d0e0bfe56407fa9be
Oct 07 13:15:38 crc kubenswrapper[4959]: E1007 13:15:38.444509 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df"
Oct 07 13:15:38 crc kubenswrapper[4959]: E1007 13:15:38.444669 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qsj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bfcb9d745-4qr99_openstack(b5a26cf8-2e0e-4e47-9fa5-66407d21539a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 13:15:38 crc kubenswrapper[4959]: E1007 13:15:38.445900 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" podUID="b5a26cf8-2e0e-4e47-9fa5-66407d21539a"
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.082814 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.084640 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z8f9v"]
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.102355 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8575fc99d7-xc6jp"]
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.227021 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 07 13:15:39 crc kubenswrapper[4959]: W1007 13:15:39.229940 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b4bc70e_cbbe_4c3f_a096_f2b1b0d89e61.slice/crio-f6b716966a4427083618248f96c372f67c8cdc1f1111a8d4c71c5261285ffbec WatchSource:0}: Error finding container f6b716966a4427083618248f96c372f67c8cdc1f1111a8d4c71c5261285ffbec: Status 404 returned error can't find the container with id f6b716966a4427083618248f96c372f67c8cdc1f1111a8d4c71c5261285ffbec
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.242989 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.249763 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.267659 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 07 13:15:39 crc kubenswrapper[4959]: W1007 13:15:39.282332 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e980567_4b6d_474f_ae89_3dc436ebf1a5.slice/crio-bc8780cf563b75f7f7b3ae17a13d0f6be30ece860c5663ff89dcd3c42709db5a WatchSource:0}: Error finding container bc8780cf563b75f7f7b3ae17a13d0f6be30ece860c5663ff89dcd3c42709db5a: Status 404 returned error can't find the container with id bc8780cf563b75f7f7b3ae17a13d0f6be30ece860c5663ff89dcd3c42709db5a
Oct 07 13:15:39 crc kubenswrapper[4959]: W1007 13:15:39.282557 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a23cdde_db3b_403e_8c39_1ed3b6c6c808.slice/crio-dbe8f26224cd69bad42fc282f078b43dd9ba8322e47e043164f2628649ff015e WatchSource:0}: Error finding container dbe8f26224cd69bad42fc282f078b43dd9ba8322e47e043164f2628649ff015e: Status 404 returned error can't find the container with id dbe8f26224cd69bad42fc282f078b43dd9ba8322e47e043164f2628649ff015e
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.323374 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 13:15:39 crc kubenswrapper[4959]: W1007 13:15:39.347673 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151d32f4_496d_43a0_aeb7_ee999d5faeef.slice/crio-ec9b439f7c941fb5dbe075c0e32d8910d7d7629871eaa095a9b41646a9b5b0a0 WatchSource:0}: Error finding container ec9b439f7c941fb5dbe075c0e32d8910d7d7629871eaa095a9b41646a9b5b0a0: Status 404 returned error can't find the container with id ec9b439f7c941fb5dbe075c0e32d8910d7d7629871eaa095a9b41646a9b5b0a0
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.434473 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fed91ea6-e906-47c4-84e0-123c01a9780d","Type":"ContainerStarted","Data":"c843c6218a619f5a02241a95d0637a5523f49f3ae307869ea82516fc8002310c"}
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.436984 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"151d32f4-496d-43a0-aeb7-ee999d5faeef","Type":"ContainerStarted","Data":"ec9b439f7c941fb5dbe075c0e32d8910d7d7629871eaa095a9b41646a9b5b0a0"}
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.439039 4959 generic.go:334] "Generic (PLEG): container finished" podID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerID="3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954" exitCode=0
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.439117 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-h8vzm" event={"ID":"db523e83-ee5e-42fd-acb8-4d22edd64e3d","Type":"ContainerDied","Data":"3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954"}
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.439175 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-h8vzm" event={"ID":"db523e83-ee5e-42fd-acb8-4d22edd64e3d","Type":"ContainerStarted","Data":"2509851f29d37a2b98bf0a1e2d63d8df194d0cd043dfdc0d0e0bfe56407fa9be"}
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.440868 4959 generic.go:334] "Generic (PLEG): container finished" podID="fe087602-e770-412d-9795-9dc51a94f267" containerID="b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7" exitCode=0
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.440945 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" event={"ID":"fe087602-e770-412d-9795-9dc51a94f267","Type":"ContainerDied","Data":"b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7"}
Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.440976 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp"
event={"ID":"fe087602-e770-412d-9795-9dc51a94f267","Type":"ContainerStarted","Data":"68afee6a7463174ca65169f463459070b4cdff96f7cd457bc82b23ab30b391e4"} Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.443249 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z8f9v" event={"ID":"907772e5-2f0c-4478-9d3b-8f82eec8f258","Type":"ContainerStarted","Data":"cf131848e76e237c68d56a861c48c7f87711f43dead37873959efaaa3bdbac65"} Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.445985 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8703d817-5027-4394-a52d-a895f7e0fd10","Type":"ContainerStarted","Data":"b9b277bd9b1c3beea429c40e125fcf1770c814f29e6c219aa891fbc2161905d5"} Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.447455 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"72f6396e-c1ff-485b-8878-33f9ab5dc874","Type":"ContainerStarted","Data":"5e877e4ebff9288d0552d27183226084fba7d21369c3bce6f7be521456bb318f"} Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.450097 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e980567-4b6d-474f-ae89-3dc436ebf1a5","Type":"ContainerStarted","Data":"bc8780cf563b75f7f7b3ae17a13d0f6be30ece860c5663ff89dcd3c42709db5a"} Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.452359 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a23cdde-db3b-403e-8c39-1ed3b6c6c808","Type":"ContainerStarted","Data":"dbe8f26224cd69bad42fc282f078b43dd9ba8322e47e043164f2628649ff015e"} Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.453963 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61","Type":"ContainerStarted","Data":"f6b716966a4427083618248f96c372f67c8cdc1f1111a8d4c71c5261285ffbec"} Oct 07 13:15:39 
crc kubenswrapper[4959]: I1007 13:15:39.599422 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.601289 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.605074 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.609407 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rdbgs" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.609683 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.610136 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.610287 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.653790 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4nl8g"] Oct 07 13:15:39 crc kubenswrapper[4959]: W1007 13:15:39.665753 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba866f69_2f83_4b66_b1af_693f07c437e0.slice/crio-57e69e883110fd5c7e256d8ec7da4ae283d8db276d487511155d8c0e3c9d41e5 WatchSource:0}: Error finding container 57e69e883110fd5c7e256d8ec7da4ae283d8db276d487511155d8c0e3c9d41e5: Status 404 returned error can't find the container with id 57e69e883110fd5c7e256d8ec7da4ae283d8db276d487511155d8c0e3c9d41e5 Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707704 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9sh6\" (UniqueName: \"kubernetes.io/projected/26d915bf-8d27-4349-9a3b-f13f13809cf5-kube-api-access-l9sh6\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707748 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707786 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707814 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d915bf-8d27-4349-9a3b-f13f13809cf5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707862 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d915bf-8d27-4349-9a3b-f13f13809cf5-config\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707877 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707920 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d915bf-8d27-4349-9a3b-f13f13809cf5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.707942 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: E1007 13:15:39.789835 4959 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 07 13:15:39 crc kubenswrapper[4959]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fe087602-e770-412d-9795-9dc51a94f267/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 13:15:39 crc kubenswrapper[4959]: > podSandboxID="68afee6a7463174ca65169f463459070b4cdff96f7cd457bc82b23ab30b391e4" Oct 07 13:15:39 crc kubenswrapper[4959]: E1007 13:15:39.790708 4959 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 07 13:15:39 crc kubenswrapper[4959]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hmbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8575fc99d7-xc6jp_openstack(fe087602-e770-412d-9795-9dc51a94f267): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fe087602-e770-412d-9795-9dc51a94f267/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 13:15:39 crc kubenswrapper[4959]: > logger="UnhandledError" Oct 07 13:15:39 crc kubenswrapper[4959]: E1007 13:15:39.791818 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fe087602-e770-412d-9795-9dc51a94f267/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" podUID="fe087602-e770-412d-9795-9dc51a94f267" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810169 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d915bf-8d27-4349-9a3b-f13f13809cf5-config\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810307 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810417 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d915bf-8d27-4349-9a3b-f13f13809cf5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810501 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810593 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9sh6\" (UniqueName: \"kubernetes.io/projected/26d915bf-8d27-4349-9a3b-f13f13809cf5-kube-api-access-l9sh6\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810723 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810850 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.810931 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d915bf-8d27-4349-9a3b-f13f13809cf5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.811664 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d915bf-8d27-4349-9a3b-f13f13809cf5-config\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.812057 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d915bf-8d27-4349-9a3b-f13f13809cf5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.812352 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.813193 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d915bf-8d27-4349-9a3b-f13f13809cf5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.817054 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.817087 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.823034 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d915bf-8d27-4349-9a3b-f13f13809cf5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.836555 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9sh6\" (UniqueName: \"kubernetes.io/projected/26d915bf-8d27-4349-9a3b-f13f13809cf5-kube-api-access-l9sh6\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.846422 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d915bf-8d27-4349-9a3b-f13f13809cf5\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.884603 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.926486 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:39 crc kubenswrapper[4959]: I1007 13:15:39.959934 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.014890 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-config\") pod \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.014951 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-dns-svc\") pod \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.014975 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-config\") pod \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.015069 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-kube-api-access-4qsj6\") pod \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\" (UID: \"b5a26cf8-2e0e-4e47-9fa5-66407d21539a\") " Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.015089 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6sjb\" 
(UniqueName: \"kubernetes.io/projected/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-kube-api-access-k6sjb\") pod \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\" (UID: \"c0866e59-1ee2-4b76-a976-e8e1a7788bf4\") " Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.016317 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-config" (OuterVolumeSpecName: "config") pod "b5a26cf8-2e0e-4e47-9fa5-66407d21539a" (UID: "b5a26cf8-2e0e-4e47-9fa5-66407d21539a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.016313 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-config" (OuterVolumeSpecName: "config") pod "c0866e59-1ee2-4b76-a976-e8e1a7788bf4" (UID: "c0866e59-1ee2-4b76-a976-e8e1a7788bf4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.016345 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0866e59-1ee2-4b76-a976-e8e1a7788bf4" (UID: "c0866e59-1ee2-4b76-a976-e8e1a7788bf4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.021404 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-kube-api-access-k6sjb" (OuterVolumeSpecName: "kube-api-access-k6sjb") pod "c0866e59-1ee2-4b76-a976-e8e1a7788bf4" (UID: "c0866e59-1ee2-4b76-a976-e8e1a7788bf4"). InnerVolumeSpecName "kube-api-access-k6sjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.021751 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-kube-api-access-4qsj6" (OuterVolumeSpecName: "kube-api-access-4qsj6") pod "b5a26cf8-2e0e-4e47-9fa5-66407d21539a" (UID: "b5a26cf8-2e0e-4e47-9fa5-66407d21539a"). InnerVolumeSpecName "kube-api-access-4qsj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.117904 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.118295 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/b5a26cf8-2e0e-4e47-9fa5-66407d21539a-kube-api-access-4qsj6\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.118313 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6sjb\" (UniqueName: \"kubernetes.io/projected/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-kube-api-access-k6sjb\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.118326 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.118358 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0866e59-1ee2-4b76-a976-e8e1a7788bf4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.374773 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 
13:15:40 crc kubenswrapper[4959]: W1007 13:15:40.412797 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d915bf_8d27_4349_9a3b_f13f13809cf5.slice/crio-96fc887f22412907607785840d1c72899e7b18b4e1b8352fa1298d88d5bf8536 WatchSource:0}: Error finding container 96fc887f22412907607785840d1c72899e7b18b4e1b8352fa1298d88d5bf8536: Status 404 returned error can't find the container with id 96fc887f22412907607785840d1c72899e7b18b4e1b8352fa1298d88d5bf8536 Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.464477 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4nl8g" event={"ID":"ba866f69-2f83-4b66-b1af-693f07c437e0","Type":"ContainerStarted","Data":"57e69e883110fd5c7e256d8ec7da4ae283d8db276d487511155d8c0e3c9d41e5"} Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.465851 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"26d915bf-8d27-4349-9a3b-f13f13809cf5","Type":"ContainerStarted","Data":"96fc887f22412907607785840d1c72899e7b18b4e1b8352fa1298d88d5bf8536"} Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.467417 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" event={"ID":"c0866e59-1ee2-4b76-a976-e8e1a7788bf4","Type":"ContainerDied","Data":"75542f9d2395b9ecb2f76e5a93de8a4222cf71175880b62e87517b9e3f7ef286"} Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.467728 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-wp7jq" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.470916 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-h8vzm" event={"ID":"db523e83-ee5e-42fd-acb8-4d22edd64e3d","Type":"ContainerStarted","Data":"43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136"} Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.471067 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.472139 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" event={"ID":"b5a26cf8-2e0e-4e47-9fa5-66407d21539a","Type":"ContainerDied","Data":"f44fac577fba2af604a250bc8ff593991d61d80c570ce91663924638d4b8dd20"} Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.472164 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-4qr99" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.490842 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77597f887-h8vzm" podStartSLOduration=13.808562794 podStartE2EDuration="14.490819429s" podCreationTimestamp="2025-10-07 13:15:26 +0000 UTC" firstStartedPulling="2025-10-07 13:15:38.431067742 +0000 UTC m=+890.591790419" lastFinishedPulling="2025-10-07 13:15:39.113324377 +0000 UTC m=+891.274047054" observedRunningTime="2025-10-07 13:15:40.485067834 +0000 UTC m=+892.645790531" watchObservedRunningTime="2025-10-07 13:15:40.490819429 +0000 UTC m=+892.651542106" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.632331 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-4qr99"] Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.642298 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7bfcb9d745-4qr99"] Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.666344 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-wp7jq"] Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.671508 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-wp7jq"] Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.821300 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a26cf8-2e0e-4e47-9fa5-66407d21539a" path="/var/lib/kubelet/pods/b5a26cf8-2e0e-4e47-9fa5-66407d21539a/volumes" Oct 07 13:15:40 crc kubenswrapper[4959]: I1007 13:15:40.821674 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0866e59-1ee2-4b76-a976-e8e1a7788bf4" path="/var/lib/kubelet/pods/c0866e59-1ee2-4b76-a976-e8e1a7788bf4/volumes" Oct 07 13:15:46 crc kubenswrapper[4959]: I1007 13:15:46.654690 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:15:46 crc kubenswrapper[4959]: I1007 13:15:46.709026 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8575fc99d7-xc6jp"] Oct 07 13:15:50 crc kubenswrapper[4959]: I1007 13:15:50.908767 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.574336 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8703d817-5027-4394-a52d-a895f7e0fd10","Type":"ContainerStarted","Data":"8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.588406 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"72f6396e-c1ff-485b-8878-33f9ab5dc874","Type":"ContainerStarted","Data":"0d8acba933908e0d5f3ed237aba7a68d2e3fefa1a16a33d12667fe3c339f22df"} Oct 07 
13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.588510 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.590537 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4nl8g" event={"ID":"ba866f69-2f83-4b66-b1af-693f07c437e0","Type":"ContainerStarted","Data":"258bf1697930f1b9d261b529af27ced5132a73a27da717ecc0808556cc89dbf2"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.592339 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"151d32f4-496d-43a0-aeb7-ee999d5faeef","Type":"ContainerStarted","Data":"ee79e15457bd8d2296e469e045405b0bee31020956fdc9c69a7d172cc07c999d"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.596989 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z8f9v" event={"ID":"907772e5-2f0c-4478-9d3b-8f82eec8f258","Type":"ContainerStarted","Data":"22c6ef0a80598ff89454aac87d1a09c21bbe039bb6f5ca8a92686e190ac85a49"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.597841 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-z8f9v" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.599918 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61","Type":"ContainerStarted","Data":"200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.600074 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.601714 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"26d915bf-8d27-4349-9a3b-f13f13809cf5","Type":"ContainerStarted","Data":"a01bd2667c0b4750850338cea7ebdb28c09b0b255ae17c5f77f41e4c8a7ab462"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.603712 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" event={"ID":"fe087602-e770-412d-9795-9dc51a94f267","Type":"ContainerStarted","Data":"bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.603852 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" podUID="fe087602-e770-412d-9795-9dc51a94f267" containerName="dnsmasq-dns" containerID="cri-o://bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548" gracePeriod=10 Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.604279 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.610963 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fed91ea6-e906-47c4-84e0-123c01a9780d","Type":"ContainerStarted","Data":"6470c8b80bb91126aa97401fcb8cec84d46074393bbd43d25031bc8276804f32"} Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.636420 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" podStartSLOduration=26.636402713 podStartE2EDuration="26.636402713s" podCreationTimestamp="2025-10-07 13:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:15:51.6328355 +0000 UTC m=+903.793558177" watchObservedRunningTime="2025-10-07 13:15:51.636402713 +0000 UTC m=+903.797125390" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.675322 4959 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-z8f9v" podStartSLOduration=4.883039733 podStartE2EDuration="15.675305014s" podCreationTimestamp="2025-10-07 13:15:36 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.103669849 +0000 UTC m=+891.264392526" lastFinishedPulling="2025-10-07 13:15:49.89593513 +0000 UTC m=+902.056657807" observedRunningTime="2025-10-07 13:15:51.668247621 +0000 UTC m=+903.828970298" watchObservedRunningTime="2025-10-07 13:15:51.675305014 +0000 UTC m=+903.836027691" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.684971 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.34060319 podStartE2EDuration="19.684950032s" podCreationTimestamp="2025-10-07 13:15:32 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.231669898 +0000 UTC m=+891.392392575" lastFinishedPulling="2025-10-07 13:15:50.5760167 +0000 UTC m=+902.736739417" observedRunningTime="2025-10-07 13:15:51.683298064 +0000 UTC m=+903.844020741" watchObservedRunningTime="2025-10-07 13:15:51.684950032 +0000 UTC m=+903.845672709" Oct 07 13:15:51 crc kubenswrapper[4959]: I1007 13:15:51.711889 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.251906934 podStartE2EDuration="21.711854567s" podCreationTimestamp="2025-10-07 13:15:30 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.264383891 +0000 UTC m=+891.425106558" lastFinishedPulling="2025-10-07 13:15:49.724331514 +0000 UTC m=+901.885054191" observedRunningTime="2025-10-07 13:15:51.703573069 +0000 UTC m=+903.864295756" watchObservedRunningTime="2025-10-07 13:15:51.711854567 +0000 UTC m=+903.872577244" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.094708 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.243233 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-config\") pod \"fe087602-e770-412d-9795-9dc51a94f267\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.244421 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-dns-svc\") pod \"fe087602-e770-412d-9795-9dc51a94f267\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.244480 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hmbw\" (UniqueName: \"kubernetes.io/projected/fe087602-e770-412d-9795-9dc51a94f267-kube-api-access-2hmbw\") pod \"fe087602-e770-412d-9795-9dc51a94f267\" (UID: \"fe087602-e770-412d-9795-9dc51a94f267\") " Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.250889 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe087602-e770-412d-9795-9dc51a94f267-kube-api-access-2hmbw" (OuterVolumeSpecName: "kube-api-access-2hmbw") pod "fe087602-e770-412d-9795-9dc51a94f267" (UID: "fe087602-e770-412d-9795-9dc51a94f267"). InnerVolumeSpecName "kube-api-access-2hmbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.278410 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe087602-e770-412d-9795-9dc51a94f267" (UID: "fe087602-e770-412d-9795-9dc51a94f267"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.304599 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-config" (OuterVolumeSpecName: "config") pod "fe087602-e770-412d-9795-9dc51a94f267" (UID: "fe087602-e770-412d-9795-9dc51a94f267"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.347450 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.347492 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe087602-e770-412d-9795-9dc51a94f267-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.347505 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hmbw\" (UniqueName: \"kubernetes.io/projected/fe087602-e770-412d-9795-9dc51a94f267-kube-api-access-2hmbw\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.621771 4959 generic.go:334] "Generic (PLEG): container finished" podID="ba866f69-2f83-4b66-b1af-693f07c437e0" containerID="258bf1697930f1b9d261b529af27ced5132a73a27da717ecc0808556cc89dbf2" exitCode=0 Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.621834 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4nl8g" event={"ID":"ba866f69-2f83-4b66-b1af-693f07c437e0","Type":"ContainerDied","Data":"258bf1697930f1b9d261b529af27ced5132a73a27da717ecc0808556cc89dbf2"} Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.623443 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"5e980567-4b6d-474f-ae89-3dc436ebf1a5","Type":"ContainerStarted","Data":"493c90fe84e0b2ebe7b3f752f1af667338578a8b99bc7b3e7f5e7b11fcc708c7"} Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.627772 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a23cdde-db3b-403e-8c39-1ed3b6c6c808","Type":"ContainerStarted","Data":"fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6"} Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.630739 4959 generic.go:334] "Generic (PLEG): container finished" podID="fe087602-e770-412d-9795-9dc51a94f267" containerID="bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548" exitCode=0 Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.631235 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" event={"ID":"fe087602-e770-412d-9795-9dc51a94f267","Type":"ContainerDied","Data":"bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548"} Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.631258 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" event={"ID":"fe087602-e770-412d-9795-9dc51a94f267","Type":"ContainerDied","Data":"68afee6a7463174ca65169f463459070b4cdff96f7cd457bc82b23ab30b391e4"} Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.631275 4959 scope.go:117] "RemoveContainer" containerID="bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.632553 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8575fc99d7-xc6jp" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.669694 4959 scope.go:117] "RemoveContainer" containerID="b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.708392 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8575fc99d7-xc6jp"] Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.712898 4959 scope.go:117] "RemoveContainer" containerID="bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.713266 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8575fc99d7-xc6jp"] Oct 07 13:15:52 crc kubenswrapper[4959]: E1007 13:15:52.713312 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548\": container with ID starting with bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548 not found: ID does not exist" containerID="bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.713342 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548"} err="failed to get container status \"bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548\": rpc error: code = NotFound desc = could not find container \"bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548\": container with ID starting with bfe51e7f36c0c3f4c82c6a6ad4ec63858a70e2fcc1b98fd9e7bc237ff36c5548 not found: ID does not exist" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.713368 4959 scope.go:117] "RemoveContainer" containerID="b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7" Oct 07 
13:15:52 crc kubenswrapper[4959]: E1007 13:15:52.713717 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7\": container with ID starting with b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7 not found: ID does not exist" containerID="b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.713740 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7"} err="failed to get container status \"b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7\": rpc error: code = NotFound desc = could not find container \"b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7\": container with ID starting with b193d54f8ca91af019ca8bec8f1e2a6f65a902c5b1094fb9cbc6b2a7362a9ef7 not found: ID does not exist" Oct 07 13:15:52 crc kubenswrapper[4959]: I1007 13:15:52.818012 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe087602-e770-412d-9795-9dc51a94f267" path="/var/lib/kubelet/pods/fe087602-e770-412d-9795-9dc51a94f267/volumes" Oct 07 13:15:53 crc kubenswrapper[4959]: I1007 13:15:53.640679 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4nl8g" event={"ID":"ba866f69-2f83-4b66-b1af-693f07c437e0","Type":"ContainerStarted","Data":"5b2c92613167b79fd114b5c83f964b4d6d7de9029db71c1765224957eb06c704"} Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.651475 4959 generic.go:334] "Generic (PLEG): container finished" podID="fed91ea6-e906-47c4-84e0-123c01a9780d" containerID="6470c8b80bb91126aa97401fcb8cec84d46074393bbd43d25031bc8276804f32" exitCode=0 Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.651556 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"fed91ea6-e906-47c4-84e0-123c01a9780d","Type":"ContainerDied","Data":"6470c8b80bb91126aa97401fcb8cec84d46074393bbd43d25031bc8276804f32"} Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.654759 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4nl8g" event={"ID":"ba866f69-2f83-4b66-b1af-693f07c437e0","Type":"ContainerStarted","Data":"ea501a816deca82eb14a8b8a0938826e261994b7ed6a68a6f9b50e420d09a39f"} Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.654879 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4nl8g" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.654914 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4nl8g" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.657072 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"26d915bf-8d27-4349-9a3b-f13f13809cf5","Type":"ContainerStarted","Data":"d6ccc55f8d2943a9a4e39d431def6f78b4f083a33e81c8272931e7c350eec23d"} Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.659115 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"151d32f4-496d-43a0-aeb7-ee999d5faeef","Type":"ContainerStarted","Data":"7ec57f2883f9c7ea90fda45f133340a3abdba8384eed2b3594181e68a53cbf84"} Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.705469 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.850264441 podStartE2EDuration="19.705452294s" podCreationTimestamp="2025-10-07 13:15:35 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.350524784 +0000 UTC m=+891.511247461" lastFinishedPulling="2025-10-07 13:15:54.205712637 +0000 UTC m=+906.366435314" observedRunningTime="2025-10-07 13:15:54.699967533 +0000 UTC m=+906.860690230" 
watchObservedRunningTime="2025-10-07 13:15:54.705452294 +0000 UTC m=+906.866174971" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.730767 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4nl8g" podStartSLOduration=8.682608029 podStartE2EDuration="18.730750834s" podCreationTimestamp="2025-10-07 13:15:36 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.676187529 +0000 UTC m=+891.836910206" lastFinishedPulling="2025-10-07 13:15:49.724330324 +0000 UTC m=+901.885053011" observedRunningTime="2025-10-07 13:15:54.724283075 +0000 UTC m=+906.885005762" watchObservedRunningTime="2025-10-07 13:15:54.730750834 +0000 UTC m=+906.891473511" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.745156 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=2.971202139 podStartE2EDuration="16.745135065s" podCreationTimestamp="2025-10-07 13:15:38 +0000 UTC" firstStartedPulling="2025-10-07 13:15:40.424034104 +0000 UTC m=+892.584756781" lastFinishedPulling="2025-10-07 13:15:54.19796703 +0000 UTC m=+906.358689707" observedRunningTime="2025-10-07 13:15:54.741341074 +0000 UTC m=+906.902063751" watchObservedRunningTime="2025-10-07 13:15:54.745135065 +0000 UTC m=+906.905857752" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.960673 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.961038 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:54 crc kubenswrapper[4959]: I1007 13:15:54.996379 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.351174 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 
13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.398295 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.672184 4959 generic.go:334] "Generic (PLEG): container finished" podID="5e980567-4b6d-474f-ae89-3dc436ebf1a5" containerID="493c90fe84e0b2ebe7b3f752f1af667338578a8b99bc7b3e7f5e7b11fcc708c7" exitCode=0 Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.672255 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e980567-4b6d-474f-ae89-3dc436ebf1a5","Type":"ContainerDied","Data":"493c90fe84e0b2ebe7b3f752f1af667338578a8b99bc7b3e7f5e7b11fcc708c7"} Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.674903 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fed91ea6-e906-47c4-84e0-123c01a9780d","Type":"ContainerStarted","Data":"36bb10f2a27ee831eb65d3d5476f270474f6bb830f424f1fdbff77476353befc"} Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.675803 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.749067 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.128354522 podStartE2EDuration="26.749044858s" podCreationTimestamp="2025-10-07 13:15:29 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.103681339 +0000 UTC m=+891.264404026" lastFinishedPulling="2025-10-07 13:15:49.724371685 +0000 UTC m=+901.885094362" observedRunningTime="2025-10-07 13:15:55.74295952 +0000 UTC m=+907.903682207" watchObservedRunningTime="2025-10-07 13:15:55.749044858 +0000 UTC m=+907.909767535" Oct 07 13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.758212 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 
13:15:55 crc kubenswrapper[4959]: I1007 13:15:55.981788 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.022153 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-545fb8c44f-hscmd"] Oct 07 13:15:56 crc kubenswrapper[4959]: E1007 13:15:56.022535 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe087602-e770-412d-9795-9dc51a94f267" containerName="dnsmasq-dns" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.022559 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe087602-e770-412d-9795-9dc51a94f267" containerName="dnsmasq-dns" Oct 07 13:15:56 crc kubenswrapper[4959]: E1007 13:15:56.022589 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe087602-e770-412d-9795-9dc51a94f267" containerName="init" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.022598 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe087602-e770-412d-9795-9dc51a94f267" containerName="init" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.022831 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe087602-e770-412d-9795-9dc51a94f267" containerName="dnsmasq-dns" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.024062 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.031078 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545fb8c44f-hscmd"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.031977 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.072969 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lqwt9"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.074158 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.077412 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.095435 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lqwt9"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208554 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde9002d-236f-4dc3-947e-98e1e4e535c1-combined-ca-bundle\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208593 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dde9002d-236f-4dc3-947e-98e1e4e535c1-ovs-rundir\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208616 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde9002d-236f-4dc3-947e-98e1e4e535c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208690 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9m77\" (UniqueName: \"kubernetes.io/projected/8012c735-f4c9-4b40-b5b8-1934efec30c5-kube-api-access-p9m77\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208755 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-config\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208812 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde9002d-236f-4dc3-947e-98e1e4e535c1-config\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.208855 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmpgb\" (UniqueName: \"kubernetes.io/projected/dde9002d-236f-4dc3-947e-98e1e4e535c1-kube-api-access-vmpgb\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: 
I1007 13:15:56.208963 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dde9002d-236f-4dc3-947e-98e1e4e535c1-ovn-rundir\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.209011 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-dns-svc\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.209145 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-ovsdbserver-sb\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310519 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde9002d-236f-4dc3-947e-98e1e4e535c1-config\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310580 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmpgb\" (UniqueName: \"kubernetes.io/projected/dde9002d-236f-4dc3-947e-98e1e4e535c1-kube-api-access-vmpgb\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310599 
4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dde9002d-236f-4dc3-947e-98e1e4e535c1-ovn-rundir\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310618 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-dns-svc\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310678 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-ovsdbserver-sb\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310706 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde9002d-236f-4dc3-947e-98e1e4e535c1-combined-ca-bundle\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.311030 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dde9002d-236f-4dc3-947e-98e1e4e535c1-ovs-rundir\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.310993 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dde9002d-236f-4dc3-947e-98e1e4e535c1-ovn-rundir\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.311119 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde9002d-236f-4dc3-947e-98e1e4e535c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.311193 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dde9002d-236f-4dc3-947e-98e1e4e535c1-ovs-rundir\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.311236 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9m77\" (UniqueName: \"kubernetes.io/projected/8012c735-f4c9-4b40-b5b8-1934efec30c5-kube-api-access-p9m77\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.312145 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-config\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.312337 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-dns-svc\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.312785 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-ovsdbserver-sb\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.312824 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde9002d-236f-4dc3-947e-98e1e4e535c1-config\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.313442 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-config\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.315833 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde9002d-236f-4dc3-947e-98e1e4e535c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.315927 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde9002d-236f-4dc3-947e-98e1e4e535c1-combined-ca-bundle\") pod 
\"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.339458 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9m77\" (UniqueName: \"kubernetes.io/projected/8012c735-f4c9-4b40-b5b8-1934efec30c5-kube-api-access-p9m77\") pod \"dnsmasq-dns-545fb8c44f-hscmd\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.351533 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmpgb\" (UniqueName: \"kubernetes.io/projected/dde9002d-236f-4dc3-947e-98e1e4e535c1-kube-api-access-vmpgb\") pod \"ovn-controller-metrics-lqwt9\" (UID: \"dde9002d-236f-4dc3-947e-98e1e4e535c1\") " pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.354420 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.372855 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545fb8c44f-hscmd"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.407892 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lqwt9" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.410544 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-glb82"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.414067 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.417335 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.431232 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-glb82"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.514402 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.514447 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.514520 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-config\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.514560 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc 
kubenswrapper[4959]: I1007 13:15:56.514582 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msfz\" (UniqueName: \"kubernetes.io/projected/cacf79ac-7a79-42e7-83c7-169654627df8-kube-api-access-5msfz\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.615472 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msfz\" (UniqueName: \"kubernetes.io/projected/cacf79ac-7a79-42e7-83c7-169654627df8-kube-api-access-5msfz\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.615524 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.615565 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.615675 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-config\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.615736 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.616677 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.616758 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.619580 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-config\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.619650 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.633697 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msfz\" (UniqueName: 
\"kubernetes.io/projected/cacf79ac-7a79-42e7-83c7-169654627df8-kube-api-access-5msfz\") pod \"dnsmasq-dns-dc9d58d7-glb82\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.684615 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e980567-4b6d-474f-ae89-3dc436ebf1a5","Type":"ContainerStarted","Data":"61fd55a09f4b077cdccbbbe3907eae19c3a92c84d4f705ade00d43d8f540b693"} Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.707986 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=17.919155673 podStartE2EDuration="28.707969814s" podCreationTimestamp="2025-10-07 13:15:28 +0000 UTC" firstStartedPulling="2025-10-07 13:15:39.290853284 +0000 UTC m=+891.451575961" lastFinishedPulling="2025-10-07 13:15:50.079667425 +0000 UTC m=+902.240390102" observedRunningTime="2025-10-07 13:15:56.707000476 +0000 UTC m=+908.867723173" watchObservedRunningTime="2025-10-07 13:15:56.707969814 +0000 UTC m=+908.868692491" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.729391 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.792533 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.857380 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lqwt9"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.879904 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.881200 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.884840 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545fb8c44f-hscmd"] Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.890002 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 13:15:56 crc kubenswrapper[4959]: W1007 13:15:56.895443 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8012c735_f4c9_4b40_b5b8_1934efec30c5.slice/crio-207d6db90155ff99f1fa6c5d60ee2e5d92a3fa087f3db41a9032f6cf5b63752d WatchSource:0}: Error finding container 207d6db90155ff99f1fa6c5d60ee2e5d92a3fa087f3db41a9032f6cf5b63752d: Status 404 returned error can't find the container with id 207d6db90155ff99f1fa6c5d60ee2e5d92a3fa087f3db41a9032f6cf5b63752d Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.895554 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qkzx6" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.895808 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.896007 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 13:15:56 crc kubenswrapper[4959]: I1007 13:15:56.898025 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.020565 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8qdr\" (UniqueName: \"kubernetes.io/projected/b18eda78-12ab-4cb2-ac1c-56907a2b4667-kube-api-access-j8qdr\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc 
kubenswrapper[4959]: I1007 13:15:57.020913 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b18eda78-12ab-4cb2-ac1c-56907a2b4667-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.020958 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18eda78-12ab-4cb2-ac1c-56907a2b4667-scripts\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.020982 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.021005 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.021036 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b18eda78-12ab-4cb2-ac1c-56907a2b4667-config\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.021054 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122793 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18eda78-12ab-4cb2-ac1c-56907a2b4667-scripts\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122861 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122896 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b18eda78-12ab-4cb2-ac1c-56907a2b4667-config\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122914 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " 
pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122957 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8qdr\" (UniqueName: \"kubernetes.io/projected/b18eda78-12ab-4cb2-ac1c-56907a2b4667-kube-api-access-j8qdr\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.122992 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b18eda78-12ab-4cb2-ac1c-56907a2b4667-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.123550 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b18eda78-12ab-4cb2-ac1c-56907a2b4667-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.123808 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18eda78-12ab-4cb2-ac1c-56907a2b4667-scripts\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.124143 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b18eda78-12ab-4cb2-ac1c-56907a2b4667-config\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.128795 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.130267 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.130439 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18eda78-12ab-4cb2-ac1c-56907a2b4667-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.140193 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8qdr\" (UniqueName: \"kubernetes.io/projected/b18eda78-12ab-4cb2-ac1c-56907a2b4667-kube-api-access-j8qdr\") pod \"ovn-northd-0\" (UID: \"b18eda78-12ab-4cb2-ac1c-56907a2b4667\") " pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.212886 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 13:15:57 crc kubenswrapper[4959]: W1007 13:15:57.294987 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcacf79ac_7a79_42e7_83c7_169654627df8.slice/crio-9e6852af39620c4e74730d5f54990c2b3b515d56b08795df01b3f34640515fdf WatchSource:0}: Error finding container 9e6852af39620c4e74730d5f54990c2b3b515d56b08795df01b3f34640515fdf: Status 404 returned error can't find the container with id 9e6852af39620c4e74730d5f54990c2b3b515d56b08795df01b3f34640515fdf Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.295472 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-glb82"] Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.662590 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 13:15:57 crc kubenswrapper[4959]: W1007 13:15:57.666907 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18eda78_12ab_4cb2_ac1c_56907a2b4667.slice/crio-0fd6a33bf99016feac9330667071f5c04931d762a6466986f50d8ab54dec4f5a WatchSource:0}: Error finding container 0fd6a33bf99016feac9330667071f5c04931d762a6466986f50d8ab54dec4f5a: Status 404 returned error can't find the container with id 0fd6a33bf99016feac9330667071f5c04931d762a6466986f50d8ab54dec4f5a Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.694070 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lqwt9" event={"ID":"dde9002d-236f-4dc3-947e-98e1e4e535c1","Type":"ContainerStarted","Data":"9ad97a39be595b7925d41f161630e499f7b0b315fbc7f174809ef932c6366448"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.694340 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lqwt9" 
event={"ID":"dde9002d-236f-4dc3-947e-98e1e4e535c1","Type":"ContainerStarted","Data":"57645a9946735ae70be9953629be1e163b8f1f2d98cc2eb9fbae914cb43f0f07"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.696280 4959 generic.go:334] "Generic (PLEG): container finished" podID="cacf79ac-7a79-42e7-83c7-169654627df8" containerID="fa70e3ec1034497a65f3a67ffe75ab8b403142fedb0964f38df8563919af4e44" exitCode=0 Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.696363 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" event={"ID":"cacf79ac-7a79-42e7-83c7-169654627df8","Type":"ContainerDied","Data":"fa70e3ec1034497a65f3a67ffe75ab8b403142fedb0964f38df8563919af4e44"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.696394 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" event={"ID":"cacf79ac-7a79-42e7-83c7-169654627df8","Type":"ContainerStarted","Data":"9e6852af39620c4e74730d5f54990c2b3b515d56b08795df01b3f34640515fdf"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.697666 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b18eda78-12ab-4cb2-ac1c-56907a2b4667","Type":"ContainerStarted","Data":"0fd6a33bf99016feac9330667071f5c04931d762a6466986f50d8ab54dec4f5a"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.700218 4959 generic.go:334] "Generic (PLEG): container finished" podID="8012c735-f4c9-4b40-b5b8-1934efec30c5" containerID="e181b7958072867164304f2dc4951ac8f3b0a10794b2e064ff61fe66c1a2d4d9" exitCode=0 Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.700437 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" event={"ID":"8012c735-f4c9-4b40-b5b8-1934efec30c5","Type":"ContainerDied","Data":"e181b7958072867164304f2dc4951ac8f3b0a10794b2e064ff61fe66c1a2d4d9"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.700480 4959 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" event={"ID":"8012c735-f4c9-4b40-b5b8-1934efec30c5","Type":"ContainerStarted","Data":"207d6db90155ff99f1fa6c5d60ee2e5d92a3fa087f3db41a9032f6cf5b63752d"} Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.720872 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lqwt9" podStartSLOduration=1.720845808 podStartE2EDuration="1.720845808s" podCreationTimestamp="2025-10-07 13:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:15:57.710787434 +0000 UTC m=+909.871510121" watchObservedRunningTime="2025-10-07 13:15:57.720845808 +0000 UTC m=+909.881568485" Oct 07 13:15:57 crc kubenswrapper[4959]: I1007 13:15:57.972128 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.047196 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-config\") pod \"8012c735-f4c9-4b40-b5b8-1934efec30c5\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.047279 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-ovsdbserver-sb\") pod \"8012c735-f4c9-4b40-b5b8-1934efec30c5\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.047341 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-dns-svc\") pod \"8012c735-f4c9-4b40-b5b8-1934efec30c5\" (UID: 
\"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.047409 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9m77\" (UniqueName: \"kubernetes.io/projected/8012c735-f4c9-4b40-b5b8-1934efec30c5-kube-api-access-p9m77\") pod \"8012c735-f4c9-4b40-b5b8-1934efec30c5\" (UID: \"8012c735-f4c9-4b40-b5b8-1934efec30c5\") " Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.057524 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8012c735-f4c9-4b40-b5b8-1934efec30c5-kube-api-access-p9m77" (OuterVolumeSpecName: "kube-api-access-p9m77") pod "8012c735-f4c9-4b40-b5b8-1934efec30c5" (UID: "8012c735-f4c9-4b40-b5b8-1934efec30c5"). InnerVolumeSpecName "kube-api-access-p9m77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.080478 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8012c735-f4c9-4b40-b5b8-1934efec30c5" (UID: "8012c735-f4c9-4b40-b5b8-1934efec30c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.098505 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-config" (OuterVolumeSpecName: "config") pod "8012c735-f4c9-4b40-b5b8-1934efec30c5" (UID: "8012c735-f4c9-4b40-b5b8-1934efec30c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.114520 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8012c735-f4c9-4b40-b5b8-1934efec30c5" (UID: "8012c735-f4c9-4b40-b5b8-1934efec30c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.151746 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9m77\" (UniqueName: \"kubernetes.io/projected/8012c735-f4c9-4b40-b5b8-1934efec30c5-kube-api-access-p9m77\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.151817 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.151832 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.151843 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8012c735-f4c9-4b40-b5b8-1934efec30c5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.715449 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" event={"ID":"cacf79ac-7a79-42e7-83c7-169654627df8","Type":"ContainerStarted","Data":"bd66846d6a2fe57e5218379116787a21a12ddf9bbbc1d6cce6e5ee567e72e0af"} Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.716901 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.726344 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" event={"ID":"8012c735-f4c9-4b40-b5b8-1934efec30c5","Type":"ContainerDied","Data":"207d6db90155ff99f1fa6c5d60ee2e5d92a3fa087f3db41a9032f6cf5b63752d"} Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.726414 4959 scope.go:117] "RemoveContainer" containerID="e181b7958072867164304f2dc4951ac8f3b0a10794b2e064ff61fe66c1a2d4d9" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.726412 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545fb8c44f-hscmd" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.749441 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" podStartSLOduration=2.7494215029999998 podStartE2EDuration="2.749421503s" podCreationTimestamp="2025-10-07 13:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:15:58.744025385 +0000 UTC m=+910.904748102" watchObservedRunningTime="2025-10-07 13:15:58.749421503 +0000 UTC m=+910.910144170" Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.878716 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545fb8c44f-hscmd"] Oct 07 13:15:58 crc kubenswrapper[4959]: I1007 13:15:58.883334 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-545fb8c44f-hscmd"] Oct 07 13:15:59 crc kubenswrapper[4959]: I1007 13:15:59.737365 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b18eda78-12ab-4cb2-ac1c-56907a2b4667","Type":"ContainerStarted","Data":"26a9870b705c784cefe64c7c99461cdd30164259c73c17eb1ca3feaac37587f4"} Oct 07 13:15:59 crc kubenswrapper[4959]: I1007 13:15:59.737725 4959 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b18eda78-12ab-4cb2-ac1c-56907a2b4667","Type":"ContainerStarted","Data":"c37b6cd06c904515b35052d256d9ea5f34d1c9f04bb4f6fc3beb8b1ff7d6b499"} Oct 07 13:15:59 crc kubenswrapper[4959]: I1007 13:15:59.737928 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 13:15:59 crc kubenswrapper[4959]: I1007 13:15:59.758658 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.775145676 podStartE2EDuration="3.758639351s" podCreationTimestamp="2025-10-07 13:15:56 +0000 UTC" firstStartedPulling="2025-10-07 13:15:57.668937609 +0000 UTC m=+909.829660286" lastFinishedPulling="2025-10-07 13:15:58.652431274 +0000 UTC m=+910.813153961" observedRunningTime="2025-10-07 13:15:59.754878771 +0000 UTC m=+911.915601478" watchObservedRunningTime="2025-10-07 13:15:59.758639351 +0000 UTC m=+911.919362028" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.499193 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.499265 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.626129 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.626635 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.685951 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.782125 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 13:16:00 crc kubenswrapper[4959]: I1007 13:16:00.828730 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8012c735-f4c9-4b40-b5b8-1934efec30c5" path="/var/lib/kubelet/pods/8012c735-f4c9-4b40-b5b8-1934efec30c5/volumes" Oct 07 13:16:02 crc kubenswrapper[4959]: I1007 13:16:02.561998 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 07 13:16:02 crc kubenswrapper[4959]: I1007 13:16:02.609549 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 13:16:02 crc kubenswrapper[4959]: I1007 13:16:02.785293 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.307520 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-q9ncv"] Oct 07 13:16:06 crc kubenswrapper[4959]: E1007 13:16:06.309887 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8012c735-f4c9-4b40-b5b8-1934efec30c5" containerName="init" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.310027 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8012c735-f4c9-4b40-b5b8-1934efec30c5" containerName="init" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.310651 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8012c735-f4c9-4b40-b5b8-1934efec30c5" containerName="init" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.311571 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.322845 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-q9ncv"] Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.402834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbtzh\" (UniqueName: \"kubernetes.io/projected/6679f23b-b441-424d-9203-34c965f1e655-kube-api-access-tbtzh\") pod \"glance-db-create-q9ncv\" (UID: \"6679f23b-b441-424d-9203-34c965f1e655\") " pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.503835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbtzh\" (UniqueName: \"kubernetes.io/projected/6679f23b-b441-424d-9203-34c965f1e655-kube-api-access-tbtzh\") pod \"glance-db-create-q9ncv\" (UID: \"6679f23b-b441-424d-9203-34c965f1e655\") " pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.522986 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbtzh\" (UniqueName: \"kubernetes.io/projected/6679f23b-b441-424d-9203-34c965f1e655-kube-api-access-tbtzh\") pod \"glance-db-create-q9ncv\" (UID: \"6679f23b-b441-424d-9203-34c965f1e655\") " pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.642533 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.793870 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.850610 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-h8vzm"] Oct 07 13:16:06 crc kubenswrapper[4959]: I1007 13:16:06.850906 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77597f887-h8vzm" podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerName="dnsmasq-dns" containerID="cri-o://43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136" gracePeriod=10 Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.139880 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-q9ncv"] Oct 07 13:16:07 crc kubenswrapper[4959]: W1007 13:16:07.160165 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6679f23b_b441_424d_9203_34c965f1e655.slice/crio-cb732c655fd120bc0799c3b67c95933f8ff155d0afa53c01ad5d86766f56c333 WatchSource:0}: Error finding container cb732c655fd120bc0799c3b67c95933f8ff155d0afa53c01ad5d86766f56c333: Status 404 returned error can't find the container with id cb732c655fd120bc0799c3b67c95933f8ff155d0afa53c01ad5d86766f56c333 Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.284933 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.420835 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-config\") pod \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.420948 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-dns-svc\") pod \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.420987 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqfqv\" (UniqueName: \"kubernetes.io/projected/db523e83-ee5e-42fd-acb8-4d22edd64e3d-kube-api-access-nqfqv\") pod \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\" (UID: \"db523e83-ee5e-42fd-acb8-4d22edd64e3d\") " Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.427822 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db523e83-ee5e-42fd-acb8-4d22edd64e3d-kube-api-access-nqfqv" (OuterVolumeSpecName: "kube-api-access-nqfqv") pod "db523e83-ee5e-42fd-acb8-4d22edd64e3d" (UID: "db523e83-ee5e-42fd-acb8-4d22edd64e3d"). InnerVolumeSpecName "kube-api-access-nqfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.461319 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db523e83-ee5e-42fd-acb8-4d22edd64e3d" (UID: "db523e83-ee5e-42fd-acb8-4d22edd64e3d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.466497 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-config" (OuterVolumeSpecName: "config") pod "db523e83-ee5e-42fd-acb8-4d22edd64e3d" (UID: "db523e83-ee5e-42fd-acb8-4d22edd64e3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.523288 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.523327 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db523e83-ee5e-42fd-acb8-4d22edd64e3d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.523339 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqfqv\" (UniqueName: \"kubernetes.io/projected/db523e83-ee5e-42fd-acb8-4d22edd64e3d-kube-api-access-nqfqv\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.695256 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.695317 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.806120 4959 generic.go:334] "Generic (PLEG): container finished" podID="6679f23b-b441-424d-9203-34c965f1e655" containerID="baf04b947eb480c3db9ea8e9c55c153df1a394062e3591eedefc31a150af12ec" exitCode=0 Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.806237 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q9ncv" event={"ID":"6679f23b-b441-424d-9203-34c965f1e655","Type":"ContainerDied","Data":"baf04b947eb480c3db9ea8e9c55c153df1a394062e3591eedefc31a150af12ec"} Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.806715 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q9ncv" event={"ID":"6679f23b-b441-424d-9203-34c965f1e655","Type":"ContainerStarted","Data":"cb732c655fd120bc0799c3b67c95933f8ff155d0afa53c01ad5d86766f56c333"} Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.808818 4959 generic.go:334] "Generic (PLEG): container finished" podID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerID="43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136" exitCode=0 Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.808842 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-h8vzm" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.808863 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-h8vzm" event={"ID":"db523e83-ee5e-42fd-acb8-4d22edd64e3d","Type":"ContainerDied","Data":"43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136"} Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.809296 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-h8vzm" event={"ID":"db523e83-ee5e-42fd-acb8-4d22edd64e3d","Type":"ContainerDied","Data":"2509851f29d37a2b98bf0a1e2d63d8df194d0cd043dfdc0d0e0bfe56407fa9be"} Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.809321 4959 scope.go:117] "RemoveContainer" containerID="43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.831957 4959 scope.go:117] "RemoveContainer" containerID="3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.866764 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-h8vzm"] Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.869205 4959 scope.go:117] "RemoveContainer" containerID="43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136" Oct 07 13:16:07 crc kubenswrapper[4959]: E1007 13:16:07.869698 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136\": container with ID starting with 43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136 not found: ID does not exist" containerID="43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.869755 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136"} err="failed to get container status \"43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136\": rpc error: code = NotFound desc = could not find container \"43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136\": container with ID starting with 43515b3e186f4b9a5ba1265f7cf478d05664b212a13b0add8bbac40f180db136 not found: ID does not exist" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.869799 4959 scope.go:117] "RemoveContainer" containerID="3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954" Oct 07 13:16:07 crc kubenswrapper[4959]: E1007 13:16:07.870541 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954\": container with ID starting with 3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954 not found: ID does not exist" containerID="3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.870676 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954"} err="failed to get container status \"3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954\": rpc error: code = NotFound desc = could not find container \"3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954\": container with ID starting with 3bb9c1ae797627d1eaf1bde010a19a5b8cf18228cc1ee8e3a12815f5f1c27954 not found: ID does not exist" Oct 07 13:16:07 crc kubenswrapper[4959]: I1007 13:16:07.879842 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77597f887-h8vzm"] Oct 07 13:16:08 crc kubenswrapper[4959]: I1007 13:16:08.819905 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" path="/var/lib/kubelet/pods/db523e83-ee5e-42fd-acb8-4d22edd64e3d/volumes" Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.163687 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.257400 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbtzh\" (UniqueName: \"kubernetes.io/projected/6679f23b-b441-424d-9203-34c965f1e655-kube-api-access-tbtzh\") pod \"6679f23b-b441-424d-9203-34c965f1e655\" (UID: \"6679f23b-b441-424d-9203-34c965f1e655\") " Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.263581 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6679f23b-b441-424d-9203-34c965f1e655-kube-api-access-tbtzh" (OuterVolumeSpecName: "kube-api-access-tbtzh") pod "6679f23b-b441-424d-9203-34c965f1e655" (UID: "6679f23b-b441-424d-9203-34c965f1e655"). InnerVolumeSpecName "kube-api-access-tbtzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.359902 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbtzh\" (UniqueName: \"kubernetes.io/projected/6679f23b-b441-424d-9203-34c965f1e655-kube-api-access-tbtzh\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.827162 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q9ncv" event={"ID":"6679f23b-b441-424d-9203-34c965f1e655","Type":"ContainerDied","Data":"cb732c655fd120bc0799c3b67c95933f8ff155d0afa53c01ad5d86766f56c333"} Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.827207 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-q9ncv" Oct 07 13:16:09 crc kubenswrapper[4959]: I1007 13:16:09.827219 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb732c655fd120bc0799c3b67c95933f8ff155d0afa53c01ad5d86766f56c333" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.557698 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-c2s2g"] Oct 07 13:16:10 crc kubenswrapper[4959]: E1007 13:16:10.558490 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerName="dnsmasq-dns" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.558512 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerName="dnsmasq-dns" Oct 07 13:16:10 crc kubenswrapper[4959]: E1007 13:16:10.558541 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6679f23b-b441-424d-9203-34c965f1e655" containerName="mariadb-database-create" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.558550 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6679f23b-b441-424d-9203-34c965f1e655" containerName="mariadb-database-create" Oct 07 13:16:10 crc kubenswrapper[4959]: E1007 13:16:10.558560 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerName="init" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.558566 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerName="init" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.558779 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="db523e83-ee5e-42fd-acb8-4d22edd64e3d" containerName="dnsmasq-dns" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.558805 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6679f23b-b441-424d-9203-34c965f1e655" 
containerName="mariadb-database-create" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.559396 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.583942 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c2s2g"] Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.682801 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrkmm\" (UniqueName: \"kubernetes.io/projected/2ae65892-33d2-4485-9586-318c8cc1b97f-kube-api-access-wrkmm\") pod \"keystone-db-create-c2s2g\" (UID: \"2ae65892-33d2-4485-9586-318c8cc1b97f\") " pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.784896 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrkmm\" (UniqueName: \"kubernetes.io/projected/2ae65892-33d2-4485-9586-318c8cc1b97f-kube-api-access-wrkmm\") pod \"keystone-db-create-c2s2g\" (UID: \"2ae65892-33d2-4485-9586-318c8cc1b97f\") " pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.807537 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrkmm\" (UniqueName: \"kubernetes.io/projected/2ae65892-33d2-4485-9586-318c8cc1b97f-kube-api-access-wrkmm\") pod \"keystone-db-create-c2s2g\" (UID: \"2ae65892-33d2-4485-9586-318c8cc1b97f\") " pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.883781 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.888951 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8zjb9"] Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.892840 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:10 crc kubenswrapper[4959]: I1007 13:16:10.906345 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8zjb9"] Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.000787 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjhq\" (UniqueName: \"kubernetes.io/projected/a00ade55-5009-425f-9c72-3f3b39cf32c5-kube-api-access-ltjhq\") pod \"placement-db-create-8zjb9\" (UID: \"a00ade55-5009-425f-9c72-3f3b39cf32c5\") " pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.102542 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjhq\" (UniqueName: \"kubernetes.io/projected/a00ade55-5009-425f-9c72-3f3b39cf32c5-kube-api-access-ltjhq\") pod \"placement-db-create-8zjb9\" (UID: \"a00ade55-5009-425f-9c72-3f3b39cf32c5\") " pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.129028 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjhq\" (UniqueName: \"kubernetes.io/projected/a00ade55-5009-425f-9c72-3f3b39cf32c5-kube-api-access-ltjhq\") pod \"placement-db-create-8zjb9\" (UID: \"a00ade55-5009-425f-9c72-3f3b39cf32c5\") " pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.280125 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.334537 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c2s2g"] Oct 07 13:16:11 crc kubenswrapper[4959]: W1007 13:16:11.337507 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae65892_33d2_4485_9586_318c8cc1b97f.slice/crio-af65d0fb3f665106d74a0b0c1f9fe6e00271682ce3d49d06f4dee76dd7e693bf WatchSource:0}: Error finding container af65d0fb3f665106d74a0b0c1f9fe6e00271682ce3d49d06f4dee76dd7e693bf: Status 404 returned error can't find the container with id af65d0fb3f665106d74a0b0c1f9fe6e00271682ce3d49d06f4dee76dd7e693bf Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.680454 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8zjb9"] Oct 07 13:16:11 crc kubenswrapper[4959]: W1007 13:16:11.685435 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda00ade55_5009_425f_9c72_3f3b39cf32c5.slice/crio-27f05a5392aed25b827dd50a8d20b14e88eadd80f6d526e99a2523f08fbb3d85 WatchSource:0}: Error finding container 27f05a5392aed25b827dd50a8d20b14e88eadd80f6d526e99a2523f08fbb3d85: Status 404 returned error can't find the container with id 27f05a5392aed25b827dd50a8d20b14e88eadd80f6d526e99a2523f08fbb3d85 Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.843340 4959 generic.go:334] "Generic (PLEG): container finished" podID="2ae65892-33d2-4485-9586-318c8cc1b97f" containerID="7dc63e1f3e52e733bb6a6edbb21c7ac9a11b7ab55869c4a2be689879995fc866" exitCode=0 Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.843403 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c2s2g" 
event={"ID":"2ae65892-33d2-4485-9586-318c8cc1b97f","Type":"ContainerDied","Data":"7dc63e1f3e52e733bb6a6edbb21c7ac9a11b7ab55869c4a2be689879995fc866"} Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.843454 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c2s2g" event={"ID":"2ae65892-33d2-4485-9586-318c8cc1b97f","Type":"ContainerStarted","Data":"af65d0fb3f665106d74a0b0c1f9fe6e00271682ce3d49d06f4dee76dd7e693bf"} Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.845486 4959 generic.go:334] "Generic (PLEG): container finished" podID="a00ade55-5009-425f-9c72-3f3b39cf32c5" containerID="4994e23adad59193118f7e57755c69a99e64b4489977d830f6eae4a5d5d3111b" exitCode=0 Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.845547 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zjb9" event={"ID":"a00ade55-5009-425f-9c72-3f3b39cf32c5","Type":"ContainerDied","Data":"4994e23adad59193118f7e57755c69a99e64b4489977d830f6eae4a5d5d3111b"} Oct 07 13:16:11 crc kubenswrapper[4959]: I1007 13:16:11.845584 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zjb9" event={"ID":"a00ade55-5009-425f-9c72-3f3b39cf32c5","Type":"ContainerStarted","Data":"27f05a5392aed25b827dd50a8d20b14e88eadd80f6d526e99a2523f08fbb3d85"} Oct 07 13:16:12 crc kubenswrapper[4959]: I1007 13:16:12.281133 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.201764 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.275612 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.340052 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrkmm\" (UniqueName: \"kubernetes.io/projected/2ae65892-33d2-4485-9586-318c8cc1b97f-kube-api-access-wrkmm\") pod \"2ae65892-33d2-4485-9586-318c8cc1b97f\" (UID: \"2ae65892-33d2-4485-9586-318c8cc1b97f\") " Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.340212 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltjhq\" (UniqueName: \"kubernetes.io/projected/a00ade55-5009-425f-9c72-3f3b39cf32c5-kube-api-access-ltjhq\") pod \"a00ade55-5009-425f-9c72-3f3b39cf32c5\" (UID: \"a00ade55-5009-425f-9c72-3f3b39cf32c5\") " Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.346277 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae65892-33d2-4485-9586-318c8cc1b97f-kube-api-access-wrkmm" (OuterVolumeSpecName: "kube-api-access-wrkmm") pod "2ae65892-33d2-4485-9586-318c8cc1b97f" (UID: "2ae65892-33d2-4485-9586-318c8cc1b97f"). InnerVolumeSpecName "kube-api-access-wrkmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.346347 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00ade55-5009-425f-9c72-3f3b39cf32c5-kube-api-access-ltjhq" (OuterVolumeSpecName: "kube-api-access-ltjhq") pod "a00ade55-5009-425f-9c72-3f3b39cf32c5" (UID: "a00ade55-5009-425f-9c72-3f3b39cf32c5"). InnerVolumeSpecName "kube-api-access-ltjhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.443360 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrkmm\" (UniqueName: \"kubernetes.io/projected/2ae65892-33d2-4485-9586-318c8cc1b97f-kube-api-access-wrkmm\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.443832 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltjhq\" (UniqueName: \"kubernetes.io/projected/a00ade55-5009-425f-9c72-3f3b39cf32c5-kube-api-access-ltjhq\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.860978 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c2s2g" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.860969 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c2s2g" event={"ID":"2ae65892-33d2-4485-9586-318c8cc1b97f","Type":"ContainerDied","Data":"af65d0fb3f665106d74a0b0c1f9fe6e00271682ce3d49d06f4dee76dd7e693bf"} Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.861148 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af65d0fb3f665106d74a0b0c1f9fe6e00271682ce3d49d06f4dee76dd7e693bf" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.863331 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zjb9" event={"ID":"a00ade55-5009-425f-9c72-3f3b39cf32c5","Type":"ContainerDied","Data":"27f05a5392aed25b827dd50a8d20b14e88eadd80f6d526e99a2523f08fbb3d85"} Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.863378 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f05a5392aed25b827dd50a8d20b14e88eadd80f6d526e99a2523f08fbb3d85" Oct 07 13:16:13 crc kubenswrapper[4959]: I1007 13:16:13.863382 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8zjb9" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.313406 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3992-account-create-p5rn5"] Oct 07 13:16:16 crc kubenswrapper[4959]: E1007 13:16:16.314215 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae65892-33d2-4485-9586-318c8cc1b97f" containerName="mariadb-database-create" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.314236 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae65892-33d2-4485-9586-318c8cc1b97f" containerName="mariadb-database-create" Oct 07 13:16:16 crc kubenswrapper[4959]: E1007 13:16:16.314280 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00ade55-5009-425f-9c72-3f3b39cf32c5" containerName="mariadb-database-create" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.314293 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00ade55-5009-425f-9c72-3f3b39cf32c5" containerName="mariadb-database-create" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.314622 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00ade55-5009-425f-9c72-3f3b39cf32c5" containerName="mariadb-database-create" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.314826 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae65892-33d2-4485-9586-318c8cc1b97f" containerName="mariadb-database-create" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.315615 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.318728 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.328886 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3992-account-create-p5rn5"] Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.394099 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wnbd\" (UniqueName: \"kubernetes.io/projected/10da62a9-350c-4749-b6d7-481dc5557926-kube-api-access-5wnbd\") pod \"glance-3992-account-create-p5rn5\" (UID: \"10da62a9-350c-4749-b6d7-481dc5557926\") " pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.496272 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wnbd\" (UniqueName: \"kubernetes.io/projected/10da62a9-350c-4749-b6d7-481dc5557926-kube-api-access-5wnbd\") pod \"glance-3992-account-create-p5rn5\" (UID: \"10da62a9-350c-4749-b6d7-481dc5557926\") " pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.518515 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wnbd\" (UniqueName: \"kubernetes.io/projected/10da62a9-350c-4749-b6d7-481dc5557926-kube-api-access-5wnbd\") pod \"glance-3992-account-create-p5rn5\" (UID: \"10da62a9-350c-4749-b6d7-481dc5557926\") " pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:16 crc kubenswrapper[4959]: I1007 13:16:16.639956 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:17 crc kubenswrapper[4959]: I1007 13:16:17.101266 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3992-account-create-p5rn5"] Oct 07 13:16:17 crc kubenswrapper[4959]: W1007 13:16:17.102305 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10da62a9_350c_4749_b6d7_481dc5557926.slice/crio-5555e62374417efe5ce39e930c527808ef44f7c57c85485a63a8204d1544a15f WatchSource:0}: Error finding container 5555e62374417efe5ce39e930c527808ef44f7c57c85485a63a8204d1544a15f: Status 404 returned error can't find the container with id 5555e62374417efe5ce39e930c527808ef44f7c57c85485a63a8204d1544a15f Oct 07 13:16:17 crc kubenswrapper[4959]: I1007 13:16:17.894444 4959 generic.go:334] "Generic (PLEG): container finished" podID="10da62a9-350c-4749-b6d7-481dc5557926" containerID="0d9b2c9488913ed0ad20bd2ca888538296debc9f01a93aff8617f44dedccc919" exitCode=0 Oct 07 13:16:17 crc kubenswrapper[4959]: I1007 13:16:17.894749 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3992-account-create-p5rn5" event={"ID":"10da62a9-350c-4749-b6d7-481dc5557926","Type":"ContainerDied","Data":"0d9b2c9488913ed0ad20bd2ca888538296debc9f01a93aff8617f44dedccc919"} Oct 07 13:16:17 crc kubenswrapper[4959]: I1007 13:16:17.894777 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3992-account-create-p5rn5" event={"ID":"10da62a9-350c-4749-b6d7-481dc5557926","Type":"ContainerStarted","Data":"5555e62374417efe5ce39e930c527808ef44f7c57c85485a63a8204d1544a15f"} Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.225799 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.344641 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wnbd\" (UniqueName: \"kubernetes.io/projected/10da62a9-350c-4749-b6d7-481dc5557926-kube-api-access-5wnbd\") pod \"10da62a9-350c-4749-b6d7-481dc5557926\" (UID: \"10da62a9-350c-4749-b6d7-481dc5557926\") " Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.350286 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10da62a9-350c-4749-b6d7-481dc5557926-kube-api-access-5wnbd" (OuterVolumeSpecName: "kube-api-access-5wnbd") pod "10da62a9-350c-4749-b6d7-481dc5557926" (UID: "10da62a9-350c-4749-b6d7-481dc5557926"). InnerVolumeSpecName "kube-api-access-5wnbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.446685 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wnbd\" (UniqueName: \"kubernetes.io/projected/10da62a9-350c-4749-b6d7-481dc5557926-kube-api-access-5wnbd\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.911948 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3992-account-create-p5rn5" event={"ID":"10da62a9-350c-4749-b6d7-481dc5557926","Type":"ContainerDied","Data":"5555e62374417efe5ce39e930c527808ef44f7c57c85485a63a8204d1544a15f"} Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.912290 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5555e62374417efe5ce39e930c527808ef44f7c57c85485a63a8204d1544a15f" Oct 07 13:16:19 crc kubenswrapper[4959]: I1007 13:16:19.912065 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3992-account-create-p5rn5" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.671459 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8d28-account-create-rk7w2"] Oct 07 13:16:20 crc kubenswrapper[4959]: E1007 13:16:20.671842 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10da62a9-350c-4749-b6d7-481dc5557926" containerName="mariadb-account-create" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.671856 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="10da62a9-350c-4749-b6d7-481dc5557926" containerName="mariadb-account-create" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.672037 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="10da62a9-350c-4749-b6d7-481dc5557926" containerName="mariadb-account-create" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.672478 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.674080 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.687198 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8d28-account-create-rk7w2"] Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.766121 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll62b\" (UniqueName: \"kubernetes.io/projected/435f9254-a99d-4d4a-831e-481261eb91b1-kube-api-access-ll62b\") pod \"keystone-8d28-account-create-rk7w2\" (UID: \"435f9254-a99d-4d4a-831e-481261eb91b1\") " pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.867890 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll62b\" 
(UniqueName: \"kubernetes.io/projected/435f9254-a99d-4d4a-831e-481261eb91b1-kube-api-access-ll62b\") pod \"keystone-8d28-account-create-rk7w2\" (UID: \"435f9254-a99d-4d4a-831e-481261eb91b1\") " pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.911149 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll62b\" (UniqueName: \"kubernetes.io/projected/435f9254-a99d-4d4a-831e-481261eb91b1-kube-api-access-ll62b\") pod \"keystone-8d28-account-create-rk7w2\" (UID: \"435f9254-a99d-4d4a-831e-481261eb91b1\") " pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:20 crc kubenswrapper[4959]: I1007 13:16:20.994005 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.065210 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a896-account-create-6kkkh"] Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.066362 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.069336 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.071675 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a896-account-create-6kkkh"] Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.172978 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxbk\" (UniqueName: \"kubernetes.io/projected/0fb5bed5-58f4-47f5-9594-4bb0a951afa9-kube-api-access-vwxbk\") pod \"placement-a896-account-create-6kkkh\" (UID: \"0fb5bed5-58f4-47f5-9594-4bb0a951afa9\") " pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.275265 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxbk\" (UniqueName: \"kubernetes.io/projected/0fb5bed5-58f4-47f5-9594-4bb0a951afa9-kube-api-access-vwxbk\") pod \"placement-a896-account-create-6kkkh\" (UID: \"0fb5bed5-58f4-47f5-9594-4bb0a951afa9\") " pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.293467 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxbk\" (UniqueName: \"kubernetes.io/projected/0fb5bed5-58f4-47f5-9594-4bb0a951afa9-kube-api-access-vwxbk\") pod \"placement-a896-account-create-6kkkh\" (UID: \"0fb5bed5-58f4-47f5-9594-4bb0a951afa9\") " pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.421937 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.462827 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-45zn7"] Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.464344 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.470718 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8d28-account-create-rk7w2"] Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.474931 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fvw77" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.474956 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.480982 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-45zn7"] Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.487200 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-z8f9v" podUID="907772e5-2f0c-4478-9d3b-8f82eec8f258" containerName="ovn-controller" probeResult="failure" output=< Oct 07 13:16:21 crc kubenswrapper[4959]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 13:16:21 crc kubenswrapper[4959]: > Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.579686 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-config-data\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.580075 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-combined-ca-bundle\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.580101 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-db-sync-config-data\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.580143 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9snf\" (UniqueName: \"kubernetes.io/projected/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-kube-api-access-h9snf\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.681599 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-config-data\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.681696 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-combined-ca-bundle\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.681721 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-db-sync-config-data\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.681764 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9snf\" (UniqueName: \"kubernetes.io/projected/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-kube-api-access-h9snf\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.686383 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-db-sync-config-data\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.687564 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-combined-ca-bundle\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.688101 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-config-data\") pod \"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.697435 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9snf\" (UniqueName: \"kubernetes.io/projected/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-kube-api-access-h9snf\") pod 
\"glance-db-sync-45zn7\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") " pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.858666 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-45zn7" Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.918894 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a896-account-create-6kkkh"] Oct 07 13:16:21 crc kubenswrapper[4959]: I1007 13:16:21.938355 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8d28-account-create-rk7w2" event={"ID":"435f9254-a99d-4d4a-831e-481261eb91b1","Type":"ContainerStarted","Data":"8350ffa4c46decc090eae8cb6b113519365d4905a754c38895e8951e5a3b2525"} Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.347781 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-45zn7"] Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.951695 4959 generic.go:334] "Generic (PLEG): container finished" podID="435f9254-a99d-4d4a-831e-481261eb91b1" containerID="b82918d4f777b85c6f1f5ec435fc596dd1e6e4f5a09c2dbd2e07e0f6494e4247" exitCode=0 Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.951793 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8d28-account-create-rk7w2" event={"ID":"435f9254-a99d-4d4a-831e-481261eb91b1","Type":"ContainerDied","Data":"b82918d4f777b85c6f1f5ec435fc596dd1e6e4f5a09c2dbd2e07e0f6494e4247"} Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.955165 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-45zn7" event={"ID":"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4","Type":"ContainerStarted","Data":"cae5fe1fab7ad6a0b7fe1920cb63560649c0f311978206e43aa9699fad23584b"} Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.957335 4959 generic.go:334] "Generic (PLEG): container finished" podID="0fb5bed5-58f4-47f5-9594-4bb0a951afa9" 
containerID="e63e589e33b68dbf95af6e5ba72c8d0fd46b1557434da8fd8f1c44f65d251292" exitCode=0 Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.957373 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a896-account-create-6kkkh" event={"ID":"0fb5bed5-58f4-47f5-9594-4bb0a951afa9","Type":"ContainerDied","Data":"e63e589e33b68dbf95af6e5ba72c8d0fd46b1557434da8fd8f1c44f65d251292"} Oct 07 13:16:22 crc kubenswrapper[4959]: I1007 13:16:22.957393 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a896-account-create-6kkkh" event={"ID":"0fb5bed5-58f4-47f5-9594-4bb0a951afa9","Type":"ContainerStarted","Data":"618708f722c78cf499de7ace02b0f687013517e94d64fe0032b6b09eb9bbfc09"} Oct 07 13:16:23 crc kubenswrapper[4959]: I1007 13:16:23.968941 4959 generic.go:334] "Generic (PLEG): container finished" podID="8703d817-5027-4394-a52d-a895f7e0fd10" containerID="8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd" exitCode=0 Oct 07 13:16:23 crc kubenswrapper[4959]: I1007 13:16:23.969022 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8703d817-5027-4394-a52d-a895f7e0fd10","Type":"ContainerDied","Data":"8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd"} Oct 07 13:16:23 crc kubenswrapper[4959]: I1007 13:16:23.975272 4959 generic.go:334] "Generic (PLEG): container finished" podID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerID="fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6" exitCode=0 Oct 07 13:16:23 crc kubenswrapper[4959]: I1007 13:16:23.975480 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a23cdde-db3b-403e-8c39-1ed3b6c6c808","Type":"ContainerDied","Data":"fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6"} Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.352347 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.387425 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.443147 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll62b\" (UniqueName: \"kubernetes.io/projected/435f9254-a99d-4d4a-831e-481261eb91b1-kube-api-access-ll62b\") pod \"435f9254-a99d-4d4a-831e-481261eb91b1\" (UID: \"435f9254-a99d-4d4a-831e-481261eb91b1\") " Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.443487 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwxbk\" (UniqueName: \"kubernetes.io/projected/0fb5bed5-58f4-47f5-9594-4bb0a951afa9-kube-api-access-vwxbk\") pod \"0fb5bed5-58f4-47f5-9594-4bb0a951afa9\" (UID: \"0fb5bed5-58f4-47f5-9594-4bb0a951afa9\") " Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.447525 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb5bed5-58f4-47f5-9594-4bb0a951afa9-kube-api-access-vwxbk" (OuterVolumeSpecName: "kube-api-access-vwxbk") pod "0fb5bed5-58f4-47f5-9594-4bb0a951afa9" (UID: "0fb5bed5-58f4-47f5-9594-4bb0a951afa9"). InnerVolumeSpecName "kube-api-access-vwxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.447653 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435f9254-a99d-4d4a-831e-481261eb91b1-kube-api-access-ll62b" (OuterVolumeSpecName: "kube-api-access-ll62b") pod "435f9254-a99d-4d4a-831e-481261eb91b1" (UID: "435f9254-a99d-4d4a-831e-481261eb91b1"). InnerVolumeSpecName "kube-api-access-ll62b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.545814 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll62b\" (UniqueName: \"kubernetes.io/projected/435f9254-a99d-4d4a-831e-481261eb91b1-kube-api-access-ll62b\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.545854 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwxbk\" (UniqueName: \"kubernetes.io/projected/0fb5bed5-58f4-47f5-9594-4bb0a951afa9-kube-api-access-vwxbk\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.985427 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8d28-account-create-rk7w2" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.985419 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8d28-account-create-rk7w2" event={"ID":"435f9254-a99d-4d4a-831e-481261eb91b1","Type":"ContainerDied","Data":"8350ffa4c46decc090eae8cb6b113519365d4905a754c38895e8951e5a3b2525"} Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.985661 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8350ffa4c46decc090eae8cb6b113519365d4905a754c38895e8951e5a3b2525" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.989738 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8703d817-5027-4394-a52d-a895f7e0fd10","Type":"ContainerStarted","Data":"94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648"} Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.990782 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.992770 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"0a23cdde-db3b-403e-8c39-1ed3b6c6c808","Type":"ContainerStarted","Data":"5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554"} Oct 07 13:16:24 crc kubenswrapper[4959]: I1007 13:16:24.993386 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:16:25 crc kubenswrapper[4959]: I1007 13:16:24.996268 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a896-account-create-6kkkh" event={"ID":"0fb5bed5-58f4-47f5-9594-4bb0a951afa9","Type":"ContainerDied","Data":"618708f722c78cf499de7ace02b0f687013517e94d64fe0032b6b09eb9bbfc09"} Oct 07 13:16:25 crc kubenswrapper[4959]: I1007 13:16:24.996296 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="618708f722c78cf499de7ace02b0f687013517e94d64fe0032b6b09eb9bbfc09" Oct 07 13:16:25 crc kubenswrapper[4959]: I1007 13:16:24.996344 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a896-account-create-6kkkh" Oct 07 13:16:25 crc kubenswrapper[4959]: I1007 13:16:25.016030 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.027068015 podStartE2EDuration="59.016011445s" podCreationTimestamp="2025-10-07 13:15:26 +0000 UTC" firstStartedPulling="2025-10-07 13:15:38.430890767 +0000 UTC m=+890.591613444" lastFinishedPulling="2025-10-07 13:15:49.419834127 +0000 UTC m=+901.580556874" observedRunningTime="2025-10-07 13:16:25.012691078 +0000 UTC m=+937.173413765" watchObservedRunningTime="2025-10-07 13:16:25.016011445 +0000 UTC m=+937.176734122" Oct 07 13:16:25 crc kubenswrapper[4959]: I1007 13:16:25.042428 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.44004027 podStartE2EDuration="59.042409168s" podCreationTimestamp="2025-10-07 13:15:26 +0000 UTC" firstStartedPulling="2025-10-07 
13:15:39.289205866 +0000 UTC m=+891.449928543" lastFinishedPulling="2025-10-07 13:15:49.891574764 +0000 UTC m=+902.052297441" observedRunningTime="2025-10-07 13:16:25.040329727 +0000 UTC m=+937.201052414" watchObservedRunningTime="2025-10-07 13:16:25.042409168 +0000 UTC m=+937.203131845" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.430463 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-z8f9v" podUID="907772e5-2f0c-4478-9d3b-8f82eec8f258" containerName="ovn-controller" probeResult="failure" output=< Oct 07 13:16:26 crc kubenswrapper[4959]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 13:16:26 crc kubenswrapper[4959]: > Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.485292 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4nl8g" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.486554 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4nl8g" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.715362 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-z8f9v-config-ldbvj"] Oct 07 13:16:26 crc kubenswrapper[4959]: E1007 13:16:26.715861 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435f9254-a99d-4d4a-831e-481261eb91b1" containerName="mariadb-account-create" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.715884 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="435f9254-a99d-4d4a-831e-481261eb91b1" containerName="mariadb-account-create" Oct 07 13:16:26 crc kubenswrapper[4959]: E1007 13:16:26.715904 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb5bed5-58f4-47f5-9594-4bb0a951afa9" containerName="mariadb-account-create" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.715911 4959 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0fb5bed5-58f4-47f5-9594-4bb0a951afa9" containerName="mariadb-account-create" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.716100 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="435f9254-a99d-4d4a-831e-481261eb91b1" containerName="mariadb-account-create" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.716118 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb5bed5-58f4-47f5-9594-4bb0a951afa9" containerName="mariadb-account-create" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.716734 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.718512 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.728457 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z8f9v-config-ldbvj"] Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.779749 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-scripts\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.779799 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-additional-scripts\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.779866 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.779934 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xwh\" (UniqueName: \"kubernetes.io/projected/a790ce30-95a9-48e9-b492-be83a63251f2-kube-api-access-49xwh\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.779964 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-log-ovn\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.779981 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run-ovn\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.880997 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-scripts\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 
13:16:26.881074 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-additional-scripts\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.881143 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.881209 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xwh\" (UniqueName: \"kubernetes.io/projected/a790ce30-95a9-48e9-b492-be83a63251f2-kube-api-access-49xwh\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.881248 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-log-ovn\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.881266 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run-ovn\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 
13:16:26.881584 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run-ovn\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.882898 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.882930 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-log-ovn\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.883696 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-scripts\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.884484 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-additional-scripts\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:26 crc kubenswrapper[4959]: I1007 13:16:26.913818 4959 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-49xwh\" (UniqueName: \"kubernetes.io/projected/a790ce30-95a9-48e9-b492-be83a63251f2-kube-api-access-49xwh\") pod \"ovn-controller-z8f9v-config-ldbvj\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:27 crc kubenswrapper[4959]: I1007 13:16:27.032377 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:27 crc kubenswrapper[4959]: I1007 13:16:27.311178 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z8f9v-config-ldbvj"] Oct 07 13:16:27 crc kubenswrapper[4959]: W1007 13:16:27.324062 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda790ce30_95a9_48e9_b492_be83a63251f2.slice/crio-761dd3e6da7a24e30b20495c10a4fc63968da3b4139239d5be4ceeb75b5ed16b WatchSource:0}: Error finding container 761dd3e6da7a24e30b20495c10a4fc63968da3b4139239d5be4ceeb75b5ed16b: Status 404 returned error can't find the container with id 761dd3e6da7a24e30b20495c10a4fc63968da3b4139239d5be4ceeb75b5ed16b Oct 07 13:16:28 crc kubenswrapper[4959]: I1007 13:16:28.024273 4959 generic.go:334] "Generic (PLEG): container finished" podID="a790ce30-95a9-48e9-b492-be83a63251f2" containerID="7e3b10863cc41b61e8cca163942242905c547c11f4aa3798b7caf5f91aabb710" exitCode=0 Oct 07 13:16:28 crc kubenswrapper[4959]: I1007 13:16:28.024312 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z8f9v-config-ldbvj" event={"ID":"a790ce30-95a9-48e9-b492-be83a63251f2","Type":"ContainerDied","Data":"7e3b10863cc41b61e8cca163942242905c547c11f4aa3798b7caf5f91aabb710"} Oct 07 13:16:28 crc kubenswrapper[4959]: I1007 13:16:28.024565 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z8f9v-config-ldbvj" 
event={"ID":"a790ce30-95a9-48e9-b492-be83a63251f2","Type":"ContainerStarted","Data":"761dd3e6da7a24e30b20495c10a4fc63968da3b4139239d5be4ceeb75b5ed16b"} Oct 07 13:16:31 crc kubenswrapper[4959]: I1007 13:16:31.436230 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-z8f9v" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.658906 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.739897 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-additional-scripts\") pod \"a790ce30-95a9-48e9-b492-be83a63251f2\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740173 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run\") pod \"a790ce30-95a9-48e9-b492-be83a63251f2\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740192 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-scripts\") pod \"a790ce30-95a9-48e9-b492-be83a63251f2\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740217 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run-ovn\") pod \"a790ce30-95a9-48e9-b492-be83a63251f2\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740249 4959 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49xwh\" (UniqueName: \"kubernetes.io/projected/a790ce30-95a9-48e9-b492-be83a63251f2-kube-api-access-49xwh\") pod \"a790ce30-95a9-48e9-b492-be83a63251f2\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740301 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-log-ovn\") pod \"a790ce30-95a9-48e9-b492-be83a63251f2\" (UID: \"a790ce30-95a9-48e9-b492-be83a63251f2\") " Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740506 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a790ce30-95a9-48e9-b492-be83a63251f2" (UID: "a790ce30-95a9-48e9-b492-be83a63251f2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740522 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a790ce30-95a9-48e9-b492-be83a63251f2" (UID: "a790ce30-95a9-48e9-b492-be83a63251f2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740541 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a790ce30-95a9-48e9-b492-be83a63251f2" (UID: "a790ce30-95a9-48e9-b492-be83a63251f2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.740549 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run" (OuterVolumeSpecName: "var-run") pod "a790ce30-95a9-48e9-b492-be83a63251f2" (UID: "a790ce30-95a9-48e9-b492-be83a63251f2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.741013 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-scripts" (OuterVolumeSpecName: "scripts") pod "a790ce30-95a9-48e9-b492-be83a63251f2" (UID: "a790ce30-95a9-48e9-b492-be83a63251f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.744695 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a790ce30-95a9-48e9-b492-be83a63251f2-kube-api-access-49xwh" (OuterVolumeSpecName: "kube-api-access-49xwh") pod "a790ce30-95a9-48e9-b492-be83a63251f2" (UID: "a790ce30-95a9-48e9-b492-be83a63251f2"). InnerVolumeSpecName "kube-api-access-49xwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.844405 4959 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.844456 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.844477 4959 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.844499 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49xwh\" (UniqueName: \"kubernetes.io/projected/a790ce30-95a9-48e9-b492-be83a63251f2-kube-api-access-49xwh\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.844530 4959 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a790ce30-95a9-48e9-b492-be83a63251f2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:34 crc kubenswrapper[4959]: I1007 13:16:34.844550 4959 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a790ce30-95a9-48e9-b492-be83a63251f2-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:35 crc kubenswrapper[4959]: I1007 13:16:35.092351 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z8f9v-config-ldbvj" event={"ID":"a790ce30-95a9-48e9-b492-be83a63251f2","Type":"ContainerDied","Data":"761dd3e6da7a24e30b20495c10a4fc63968da3b4139239d5be4ceeb75b5ed16b"} Oct 07 13:16:35 crc 
kubenswrapper[4959]: I1007 13:16:35.092399 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="761dd3e6da7a24e30b20495c10a4fc63968da3b4139239d5be4ceeb75b5ed16b" Oct 07 13:16:35 crc kubenswrapper[4959]: I1007 13:16:35.092406 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z8f9v-config-ldbvj" Oct 07 13:16:35 crc kubenswrapper[4959]: I1007 13:16:35.763175 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-z8f9v-config-ldbvj"] Oct 07 13:16:35 crc kubenswrapper[4959]: I1007 13:16:35.768735 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-z8f9v-config-ldbvj"] Oct 07 13:16:36 crc kubenswrapper[4959]: I1007 13:16:36.101388 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-45zn7" event={"ID":"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4","Type":"ContainerStarted","Data":"1edefd15b57f4239069b4c50373d3b1bb37ae8f3393f1d88660a103bcea27a79"} Oct 07 13:16:36 crc kubenswrapper[4959]: I1007 13:16:36.122892 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-45zn7" podStartSLOduration=2.830652304 podStartE2EDuration="15.122872813s" podCreationTimestamp="2025-10-07 13:16:21 +0000 UTC" firstStartedPulling="2025-10-07 13:16:22.358211628 +0000 UTC m=+934.518934305" lastFinishedPulling="2025-10-07 13:16:34.650432137 +0000 UTC m=+946.811154814" observedRunningTime="2025-10-07 13:16:36.116400713 +0000 UTC m=+948.277123390" watchObservedRunningTime="2025-10-07 13:16:36.122872813 +0000 UTC m=+948.283595490" Oct 07 13:16:36 crc kubenswrapper[4959]: I1007 13:16:36.819874 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a790ce30-95a9-48e9-b492-be83a63251f2" path="/var/lib/kubelet/pods/a790ce30-95a9-48e9-b492-be83a63251f2/volumes" Oct 07 13:16:37 crc kubenswrapper[4959]: I1007 13:16:37.485870 4959 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:16:37 crc kubenswrapper[4959]: I1007 13:16:37.696243 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:16:37 crc kubenswrapper[4959]: I1007 13:16:37.696337 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:16:37 crc kubenswrapper[4959]: I1007 13:16:37.812855 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.531008 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nwskk"] Oct 07 13:16:39 crc kubenswrapper[4959]: E1007 13:16:39.533346 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a790ce30-95a9-48e9-b492-be83a63251f2" containerName="ovn-config" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.533366 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a790ce30-95a9-48e9-b492-be83a63251f2" containerName="ovn-config" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.533582 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a790ce30-95a9-48e9-b492-be83a63251f2" containerName="ovn-config" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.534225 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nwskk" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.541006 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nwskk"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.639181 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqtl\" (UniqueName: \"kubernetes.io/projected/fdfba247-ed13-4e8a-a125-f5b94bef38f6-kube-api-access-cpqtl\") pod \"barbican-db-create-nwskk\" (UID: \"fdfba247-ed13-4e8a-a125-f5b94bef38f6\") " pod="openstack/barbican-db-create-nwskk" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.729968 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mr66q"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.730966 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mr66q" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.740340 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqtl\" (UniqueName: \"kubernetes.io/projected/fdfba247-ed13-4e8a-a125-f5b94bef38f6-kube-api-access-cpqtl\") pod \"barbican-db-create-nwskk\" (UID: \"fdfba247-ed13-4e8a-a125-f5b94bef38f6\") " pod="openstack/barbican-db-create-nwskk" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.741601 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mr66q"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.764278 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqtl\" (UniqueName: \"kubernetes.io/projected/fdfba247-ed13-4e8a-a125-f5b94bef38f6-kube-api-access-cpqtl\") pod \"barbican-db-create-nwskk\" (UID: \"fdfba247-ed13-4e8a-a125-f5b94bef38f6\") " pod="openstack/barbican-db-create-nwskk" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.830726 4959 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-cxjwg"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.831902 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cxjwg" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.838152 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cxjwg"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.841587 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2sk\" (UniqueName: \"kubernetes.io/projected/3c2cf091-11f1-43ed-9f57-bb01bc99da1f-kube-api-access-6w2sk\") pod \"cinder-db-create-mr66q\" (UID: \"3c2cf091-11f1-43ed-9f57-bb01bc99da1f\") " pod="openstack/cinder-db-create-mr66q" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.867241 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nwskk" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.908293 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8phz8"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.909444 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.911987 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.912190 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.912370 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.912494 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6s5kg" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.942880 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-combined-ca-bundle\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.942964 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-config-data\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.943157 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74p9\" (UniqueName: \"kubernetes.io/projected/fd9b58f4-6d48-4be8-b788-386f6c267440-kube-api-access-c74p9\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.943396 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2sk\" (UniqueName: \"kubernetes.io/projected/3c2cf091-11f1-43ed-9f57-bb01bc99da1f-kube-api-access-6w2sk\") pod \"cinder-db-create-mr66q\" (UID: \"3c2cf091-11f1-43ed-9f57-bb01bc99da1f\") " pod="openstack/cinder-db-create-mr66q" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.943536 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpfvq\" (UniqueName: \"kubernetes.io/projected/18c95289-a526-40de-93ff-f7232cb3bf90-kube-api-access-cpfvq\") pod \"neutron-db-create-cxjwg\" (UID: \"18c95289-a526-40de-93ff-f7232cb3bf90\") " pod="openstack/neutron-db-create-cxjwg" Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.960367 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8phz8"] Oct 07 13:16:39 crc kubenswrapper[4959]: I1007 13:16:39.964693 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w2sk\" (UniqueName: \"kubernetes.io/projected/3c2cf091-11f1-43ed-9f57-bb01bc99da1f-kube-api-access-6w2sk\") pod \"cinder-db-create-mr66q\" (UID: \"3c2cf091-11f1-43ed-9f57-bb01bc99da1f\") " pod="openstack/cinder-db-create-mr66q" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.044503 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-combined-ca-bundle\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.044863 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-config-data\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " 
pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.044925 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74p9\" (UniqueName: \"kubernetes.io/projected/fd9b58f4-6d48-4be8-b788-386f6c267440-kube-api-access-c74p9\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.044992 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpfvq\" (UniqueName: \"kubernetes.io/projected/18c95289-a526-40de-93ff-f7232cb3bf90-kube-api-access-cpfvq\") pod \"neutron-db-create-cxjwg\" (UID: \"18c95289-a526-40de-93ff-f7232cb3bf90\") " pod="openstack/neutron-db-create-cxjwg" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.047885 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mr66q" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.059419 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-config-data\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.064440 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-combined-ca-bundle\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.066311 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpfvq\" (UniqueName: \"kubernetes.io/projected/18c95289-a526-40de-93ff-f7232cb3bf90-kube-api-access-cpfvq\") 
pod \"neutron-db-create-cxjwg\" (UID: \"18c95289-a526-40de-93ff-f7232cb3bf90\") " pod="openstack/neutron-db-create-cxjwg" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.083196 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74p9\" (UniqueName: \"kubernetes.io/projected/fd9b58f4-6d48-4be8-b788-386f6c267440-kube-api-access-c74p9\") pod \"keystone-db-sync-8phz8\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") " pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.152063 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cxjwg" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.247968 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.314612 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mr66q"] Oct 07 13:16:40 crc kubenswrapper[4959]: W1007 13:16:40.337290 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c2cf091_11f1_43ed_9f57_bb01bc99da1f.slice/crio-d384d1f982f93af4b4fb50c04420546407fe059c041b8269bea3496b7a30935d WatchSource:0}: Error finding container d384d1f982f93af4b4fb50c04420546407fe059c041b8269bea3496b7a30935d: Status 404 returned error can't find the container with id d384d1f982f93af4b4fb50c04420546407fe059c041b8269bea3496b7a30935d Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.395295 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nwskk"] Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.623960 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cxjwg"] Oct 07 13:16:40 crc kubenswrapper[4959]: I1007 13:16:40.745791 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-sync-8phz8"]
Oct 07 13:16:40 crc kubenswrapper[4959]: W1007 13:16:40.749518 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd9b58f4_6d48_4be8_b788_386f6c267440.slice/crio-4832691695d42915854923f1660b4773fb802db12f35f99d1cf8171f8c8eee91 WatchSource:0}: Error finding container 4832691695d42915854923f1660b4773fb802db12f35f99d1cf8171f8c8eee91: Status 404 returned error can't find the container with id 4832691695d42915854923f1660b4773fb802db12f35f99d1cf8171f8c8eee91
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.149807 4959 generic.go:334] "Generic (PLEG): container finished" podID="3c2cf091-11f1-43ed-9f57-bb01bc99da1f" containerID="b5e9272db5738235bad8f59f8b86398988648469419581dce9e25fede80f2e4b" exitCode=0
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.150067 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mr66q" event={"ID":"3c2cf091-11f1-43ed-9f57-bb01bc99da1f","Type":"ContainerDied","Data":"b5e9272db5738235bad8f59f8b86398988648469419581dce9e25fede80f2e4b"}
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.150130 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mr66q" event={"ID":"3c2cf091-11f1-43ed-9f57-bb01bc99da1f","Type":"ContainerStarted","Data":"d384d1f982f93af4b4fb50c04420546407fe059c041b8269bea3496b7a30935d"}
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.151412 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8phz8" event={"ID":"fd9b58f4-6d48-4be8-b788-386f6c267440","Type":"ContainerStarted","Data":"4832691695d42915854923f1660b4773fb802db12f35f99d1cf8171f8c8eee91"}
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.152823 4959 generic.go:334] "Generic (PLEG): container finished" podID="fdfba247-ed13-4e8a-a125-f5b94bef38f6" containerID="ae1015e6de0fafa33d796250f5a1054d57387ce329d67215c3e8b718f5d0aebd" exitCode=0
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.152914 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nwskk" event={"ID":"fdfba247-ed13-4e8a-a125-f5b94bef38f6","Type":"ContainerDied","Data":"ae1015e6de0fafa33d796250f5a1054d57387ce329d67215c3e8b718f5d0aebd"}
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.153072 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nwskk" event={"ID":"fdfba247-ed13-4e8a-a125-f5b94bef38f6","Type":"ContainerStarted","Data":"841ab3575981b88c3542483c0d98266ab7930d772f42765b8d8646ff0004cb41"}
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.154557 4959 generic.go:334] "Generic (PLEG): container finished" podID="18c95289-a526-40de-93ff-f7232cb3bf90" containerID="1f2edbdc0900de2e167bdccde230acd02b205f2c5404e416842254a8334783cd" exitCode=0
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.154587 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cxjwg" event={"ID":"18c95289-a526-40de-93ff-f7232cb3bf90","Type":"ContainerDied","Data":"1f2edbdc0900de2e167bdccde230acd02b205f2c5404e416842254a8334783cd"}
Oct 07 13:16:41 crc kubenswrapper[4959]: I1007 13:16:41.154652 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cxjwg" event={"ID":"18c95289-a526-40de-93ff-f7232cb3bf90","Type":"ContainerStarted","Data":"bf5a300bc7253a0eaa5e51de9dedc269c6263af97501ab2191afcaf04a218e36"}
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.163903 4959 generic.go:334] "Generic (PLEG): container finished" podID="6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" containerID="1edefd15b57f4239069b4c50373d3b1bb37ae8f3393f1d88660a103bcea27a79" exitCode=0
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.163989 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-45zn7" event={"ID":"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4","Type":"ContainerDied","Data":"1edefd15b57f4239069b4c50373d3b1bb37ae8f3393f1d88660a103bcea27a79"}
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.632255 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mr66q"
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.638098 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cxjwg"
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.649114 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nwskk"
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.699095 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpfvq\" (UniqueName: \"kubernetes.io/projected/18c95289-a526-40de-93ff-f7232cb3bf90-kube-api-access-cpfvq\") pod \"18c95289-a526-40de-93ff-f7232cb3bf90\" (UID: \"18c95289-a526-40de-93ff-f7232cb3bf90\") "
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.699216 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w2sk\" (UniqueName: \"kubernetes.io/projected/3c2cf091-11f1-43ed-9f57-bb01bc99da1f-kube-api-access-6w2sk\") pod \"3c2cf091-11f1-43ed-9f57-bb01bc99da1f\" (UID: \"3c2cf091-11f1-43ed-9f57-bb01bc99da1f\") "
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.699258 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpqtl\" (UniqueName: \"kubernetes.io/projected/fdfba247-ed13-4e8a-a125-f5b94bef38f6-kube-api-access-cpqtl\") pod \"fdfba247-ed13-4e8a-a125-f5b94bef38f6\" (UID: \"fdfba247-ed13-4e8a-a125-f5b94bef38f6\") "
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.704473 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfba247-ed13-4e8a-a125-f5b94bef38f6-kube-api-access-cpqtl" (OuterVolumeSpecName: "kube-api-access-cpqtl") pod "fdfba247-ed13-4e8a-a125-f5b94bef38f6" (UID: "fdfba247-ed13-4e8a-a125-f5b94bef38f6"). InnerVolumeSpecName "kube-api-access-cpqtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.705793 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c95289-a526-40de-93ff-f7232cb3bf90-kube-api-access-cpfvq" (OuterVolumeSpecName: "kube-api-access-cpfvq") pod "18c95289-a526-40de-93ff-f7232cb3bf90" (UID: "18c95289-a526-40de-93ff-f7232cb3bf90"). InnerVolumeSpecName "kube-api-access-cpfvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.706105 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2cf091-11f1-43ed-9f57-bb01bc99da1f-kube-api-access-6w2sk" (OuterVolumeSpecName: "kube-api-access-6w2sk") pod "3c2cf091-11f1-43ed-9f57-bb01bc99da1f" (UID: "3c2cf091-11f1-43ed-9f57-bb01bc99da1f"). InnerVolumeSpecName "kube-api-access-6w2sk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.800116 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpqtl\" (UniqueName: \"kubernetes.io/projected/fdfba247-ed13-4e8a-a125-f5b94bef38f6-kube-api-access-cpqtl\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.800450 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpfvq\" (UniqueName: \"kubernetes.io/projected/18c95289-a526-40de-93ff-f7232cb3bf90-kube-api-access-cpfvq\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:42 crc kubenswrapper[4959]: I1007 13:16:42.800462 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w2sk\" (UniqueName: \"kubernetes.io/projected/3c2cf091-11f1-43ed-9f57-bb01bc99da1f-kube-api-access-6w2sk\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.172281 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cxjwg"
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.172759 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cxjwg" event={"ID":"18c95289-a526-40de-93ff-f7232cb3bf90","Type":"ContainerDied","Data":"bf5a300bc7253a0eaa5e51de9dedc269c6263af97501ab2191afcaf04a218e36"}
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.172799 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf5a300bc7253a0eaa5e51de9dedc269c6263af97501ab2191afcaf04a218e36"
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.175192 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mr66q" event={"ID":"3c2cf091-11f1-43ed-9f57-bb01bc99da1f","Type":"ContainerDied","Data":"d384d1f982f93af4b4fb50c04420546407fe059c041b8269bea3496b7a30935d"}
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.175243 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mr66q"
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.175245 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d384d1f982f93af4b4fb50c04420546407fe059c041b8269bea3496b7a30935d"
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.178199 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nwskk"
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.178598 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nwskk" event={"ID":"fdfba247-ed13-4e8a-a125-f5b94bef38f6","Type":"ContainerDied","Data":"841ab3575981b88c3542483c0d98266ab7930d772f42765b8d8646ff0004cb41"}
Oct 07 13:16:43 crc kubenswrapper[4959]: I1007 13:16:43.178651 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841ab3575981b88c3542483c0d98266ab7930d772f42765b8d8646ff0004cb41"
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.387753 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-45zn7"
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.547741 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-config-data\") pod \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") "
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.548013 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-db-sync-config-data\") pod \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") "
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.548047 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9snf\" (UniqueName: \"kubernetes.io/projected/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-kube-api-access-h9snf\") pod \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") "
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.548122 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-combined-ca-bundle\") pod \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\" (UID: \"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4\") "
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.551441 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" (UID: "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.552354 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-kube-api-access-h9snf" (OuterVolumeSpecName: "kube-api-access-h9snf") pod "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" (UID: "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4"). InnerVolumeSpecName "kube-api-access-h9snf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.574884 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" (UID: "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.595088 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-config-data" (OuterVolumeSpecName: "config-data") pod "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" (UID: "6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.649525 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.649564 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.649578 4959 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:45 crc kubenswrapper[4959]: I1007 13:16:45.649590 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9snf\" (UniqueName: \"kubernetes.io/projected/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4-kube-api-access-h9snf\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.201047 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-45zn7" event={"ID":"6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4","Type":"ContainerDied","Data":"cae5fe1fab7ad6a0b7fe1920cb63560649c0f311978206e43aa9699fad23584b"}
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.201109 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cae5fe1fab7ad6a0b7fe1920cb63560649c0f311978206e43aa9699fad23584b"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.201197 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-45zn7"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.204149 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8phz8" event={"ID":"fd9b58f4-6d48-4be8-b788-386f6c267440","Type":"ContainerStarted","Data":"16a234b398c31c4095d4a34efd0b605e7ad2c3196abcde28ef2ad08e3f8c57a3"}
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.234842 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8phz8" podStartSLOduration=2.601046808 podStartE2EDuration="7.23481975s" podCreationTimestamp="2025-10-07 13:16:39 +0000 UTC" firstStartedPulling="2025-10-07 13:16:40.751961507 +0000 UTC m=+952.912684184" lastFinishedPulling="2025-10-07 13:16:45.385734419 +0000 UTC m=+957.546457126" observedRunningTime="2025-10-07 13:16:46.219508072 +0000 UTC m=+958.380230789" watchObservedRunningTime="2025-10-07 13:16:46.23481975 +0000 UTC m=+958.395542467"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.766359 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74b7749bc7-wbq7s"]
Oct 07 13:16:46 crc kubenswrapper[4959]: E1007 13:16:46.767020 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2cf091-11f1-43ed-9f57-bb01bc99da1f" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767040 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2cf091-11f1-43ed-9f57-bb01bc99da1f" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: E1007 13:16:46.767057 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c95289-a526-40de-93ff-f7232cb3bf90" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767068 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c95289-a526-40de-93ff-f7232cb3bf90" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: E1007 13:16:46.767092 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" containerName="glance-db-sync"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767101 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" containerName="glance-db-sync"
Oct 07 13:16:46 crc kubenswrapper[4959]: E1007 13:16:46.767117 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfba247-ed13-4e8a-a125-f5b94bef38f6" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767125 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfba247-ed13-4e8a-a125-f5b94bef38f6" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767301 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfba247-ed13-4e8a-a125-f5b94bef38f6" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767325 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" containerName="glance-db-sync"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767341 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c95289-a526-40de-93ff-f7232cb3bf90" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.767356 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2cf091-11f1-43ed-9f57-bb01bc99da1f" containerName="mariadb-database-create"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.769805 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.828976 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74b7749bc7-wbq7s"]
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.969676 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-dns-svc\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.969715 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc99d\" (UniqueName: \"kubernetes.io/projected/38662688-50da-4805-ad0f-e2569f64e5dd-kube-api-access-rc99d\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.969742 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-nb\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.969892 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-sb\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:46 crc kubenswrapper[4959]: I1007 13:16:46.970121 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-config\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.071669 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-dns-svc\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.071707 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc99d\" (UniqueName: \"kubernetes.io/projected/38662688-50da-4805-ad0f-e2569f64e5dd-kube-api-access-rc99d\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.071733 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-nb\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.071762 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-sb\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.071826 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-config\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.072376 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-dns-svc\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.072462 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-config\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.072972 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-nb\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.073016 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-sb\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.089776 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc99d\" (UniqueName: \"kubernetes.io/projected/38662688-50da-4805-ad0f-e2569f64e5dd-kube-api-access-rc99d\") pod \"dnsmasq-dns-74b7749bc7-wbq7s\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.384075 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:47 crc kubenswrapper[4959]: I1007 13:16:47.840034 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74b7749bc7-wbq7s"]
Oct 07 13:16:48 crc kubenswrapper[4959]: I1007 13:16:48.222318 4959 generic.go:334] "Generic (PLEG): container finished" podID="38662688-50da-4805-ad0f-e2569f64e5dd" containerID="fb83c8838737f9c639692a1cfd20662de884dba340df75b2c545bd3924408864" exitCode=0
Oct 07 13:16:48 crc kubenswrapper[4959]: I1007 13:16:48.222366 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" event={"ID":"38662688-50da-4805-ad0f-e2569f64e5dd","Type":"ContainerDied","Data":"fb83c8838737f9c639692a1cfd20662de884dba340df75b2c545bd3924408864"}
Oct 07 13:16:48 crc kubenswrapper[4959]: I1007 13:16:48.222606 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" event={"ID":"38662688-50da-4805-ad0f-e2569f64e5dd","Type":"ContainerStarted","Data":"3536b94c7a6a2a3a8db855bb6bfc706bcd98ba57a9c87c4960fb9a59b8482b54"}
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.232751 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" event={"ID":"38662688-50da-4805-ad0f-e2569f64e5dd","Type":"ContainerStarted","Data":"6a58725908e0dcaecbc7cee6280012baa1cf1b597ec257999d10fdf6e8751636"}
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.233212 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.235441 4959 generic.go:334] "Generic (PLEG): container finished" podID="fd9b58f4-6d48-4be8-b788-386f6c267440" containerID="16a234b398c31c4095d4a34efd0b605e7ad2c3196abcde28ef2ad08e3f8c57a3" exitCode=0
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.235498 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8phz8" event={"ID":"fd9b58f4-6d48-4be8-b788-386f6c267440","Type":"ContainerDied","Data":"16a234b398c31c4095d4a34efd0b605e7ad2c3196abcde28ef2ad08e3f8c57a3"}
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.252598 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" podStartSLOduration=3.252580144 podStartE2EDuration="3.252580144s" podCreationTimestamp="2025-10-07 13:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:16:49.250835563 +0000 UTC m=+961.411558260" watchObservedRunningTime="2025-10-07 13:16:49.252580144 +0000 UTC m=+961.413302831"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.695898 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4df3-account-create-g4qsz"]
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.696907 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4df3-account-create-g4qsz"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.699887 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.705510 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4df3-account-create-g4qsz"]
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.733482 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtzpq\" (UniqueName: \"kubernetes.io/projected/d01545ad-9354-4300-b539-c48ec9ff1862-kube-api-access-mtzpq\") pod \"cinder-4df3-account-create-g4qsz\" (UID: \"d01545ad-9354-4300-b539-c48ec9ff1862\") " pod="openstack/cinder-4df3-account-create-g4qsz"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.803804 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-44fa-account-create-wvt5v"]
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.804928 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-44fa-account-create-wvt5v"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.807384 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.829911 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-44fa-account-create-wvt5v"]
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.834741 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtl6f\" (UniqueName: \"kubernetes.io/projected/38640b66-4901-479c-ade1-65fe23e63db6-kube-api-access-gtl6f\") pod \"barbican-44fa-account-create-wvt5v\" (UID: \"38640b66-4901-479c-ade1-65fe23e63db6\") " pod="openstack/barbican-44fa-account-create-wvt5v"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.834886 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtzpq\" (UniqueName: \"kubernetes.io/projected/d01545ad-9354-4300-b539-c48ec9ff1862-kube-api-access-mtzpq\") pod \"cinder-4df3-account-create-g4qsz\" (UID: \"d01545ad-9354-4300-b539-c48ec9ff1862\") " pod="openstack/cinder-4df3-account-create-g4qsz"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.851911 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtzpq\" (UniqueName: \"kubernetes.io/projected/d01545ad-9354-4300-b539-c48ec9ff1862-kube-api-access-mtzpq\") pod \"cinder-4df3-account-create-g4qsz\" (UID: \"d01545ad-9354-4300-b539-c48ec9ff1862\") " pod="openstack/cinder-4df3-account-create-g4qsz"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.936441 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtl6f\" (UniqueName: \"kubernetes.io/projected/38640b66-4901-479c-ade1-65fe23e63db6-kube-api-access-gtl6f\") pod \"barbican-44fa-account-create-wvt5v\" (UID: \"38640b66-4901-479c-ade1-65fe23e63db6\") " pod="openstack/barbican-44fa-account-create-wvt5v"
Oct 07 13:16:49 crc kubenswrapper[4959]: I1007 13:16:49.952650 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtl6f\" (UniqueName: \"kubernetes.io/projected/38640b66-4901-479c-ade1-65fe23e63db6-kube-api-access-gtl6f\") pod \"barbican-44fa-account-create-wvt5v\" (UID: \"38640b66-4901-479c-ade1-65fe23e63db6\") " pod="openstack/barbican-44fa-account-create-wvt5v"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.024265 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4df3-account-create-g4qsz"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.101265 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c298-account-create-qkt7r"]
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.102509 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c298-account-create-qkt7r"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.104063 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.113025 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c298-account-create-qkt7r"]
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.129974 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-44fa-account-create-wvt5v"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.139738 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hnsc\" (UniqueName: \"kubernetes.io/projected/90a6fe32-b9be-4678-acc7-9966256aa15d-kube-api-access-5hnsc\") pod \"neutron-c298-account-create-qkt7r\" (UID: \"90a6fe32-b9be-4678-acc7-9966256aa15d\") " pod="openstack/neutron-c298-account-create-qkt7r"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.241615 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hnsc\" (UniqueName: \"kubernetes.io/projected/90a6fe32-b9be-4678-acc7-9966256aa15d-kube-api-access-5hnsc\") pod \"neutron-c298-account-create-qkt7r\" (UID: \"90a6fe32-b9be-4678-acc7-9966256aa15d\") " pod="openstack/neutron-c298-account-create-qkt7r"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.262584 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hnsc\" (UniqueName: \"kubernetes.io/projected/90a6fe32-b9be-4678-acc7-9966256aa15d-kube-api-access-5hnsc\") pod \"neutron-c298-account-create-qkt7r\" (UID: \"90a6fe32-b9be-4678-acc7-9966256aa15d\") " pod="openstack/neutron-c298-account-create-qkt7r"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.487052 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4df3-account-create-g4qsz"]
Oct 07 13:16:50 crc kubenswrapper[4959]: W1007 13:16:50.490893 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd01545ad_9354_4300_b539_c48ec9ff1862.slice/crio-be1b5e5739081004f04b92c5946a2c89972c2092981097395a68a04fa262c6ed WatchSource:0}: Error finding container be1b5e5739081004f04b92c5946a2c89972c2092981097395a68a04fa262c6ed: Status 404 returned error can't find the container with id be1b5e5739081004f04b92c5946a2c89972c2092981097395a68a04fa262c6ed
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.507295 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c298-account-create-qkt7r"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.530781 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8phz8"
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.546441 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-config-data\") pod \"fd9b58f4-6d48-4be8-b788-386f6c267440\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") "
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.546552 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-combined-ca-bundle\") pod \"fd9b58f4-6d48-4be8-b788-386f6c267440\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") "
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.546598 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74p9\" (UniqueName: \"kubernetes.io/projected/fd9b58f4-6d48-4be8-b788-386f6c267440-kube-api-access-c74p9\") pod \"fd9b58f4-6d48-4be8-b788-386f6c267440\" (UID: \"fd9b58f4-6d48-4be8-b788-386f6c267440\") "
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.551804 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9b58f4-6d48-4be8-b788-386f6c267440-kube-api-access-c74p9" (OuterVolumeSpecName: "kube-api-access-c74p9") pod "fd9b58f4-6d48-4be8-b788-386f6c267440" (UID: "fd9b58f4-6d48-4be8-b788-386f6c267440"). InnerVolumeSpecName "kube-api-access-c74p9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.594505 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-44fa-account-create-wvt5v"]
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.642398 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd9b58f4-6d48-4be8-b788-386f6c267440" (UID: "fd9b58f4-6d48-4be8-b788-386f6c267440"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.649263 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.649296 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c74p9\" (UniqueName: \"kubernetes.io/projected/fd9b58f4-6d48-4be8-b788-386f6c267440-kube-api-access-c74p9\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.663120 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-config-data" (OuterVolumeSpecName: "config-data") pod "fd9b58f4-6d48-4be8-b788-386f6c267440" (UID: "fd9b58f4-6d48-4be8-b788-386f6c267440"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.750848 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9b58f4-6d48-4be8-b788-386f6c267440-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:16:50 crc kubenswrapper[4959]: I1007 13:16:50.956530 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c298-account-create-qkt7r"]
Oct 07 13:16:50 crc kubenswrapper[4959]: W1007 13:16:50.960654 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90a6fe32_b9be_4678_acc7_9966256aa15d.slice/crio-92ac18997e0a6b03cc5be8fd36afed853913c50a78cf9835079218a787dc03cc WatchSource:0}: Error finding container 92ac18997e0a6b03cc5be8fd36afed853913c50a78cf9835079218a787dc03cc: Status 404 returned error can't find the container with id 92ac18997e0a6b03cc5be8fd36afed853913c50a78cf9835079218a787dc03cc
Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.261432 4959 generic.go:334] "Generic (PLEG): container finished" podID="38640b66-4901-479c-ade1-65fe23e63db6" containerID="a27bcfc95ac6fade11d6ff4d4c83a2ab22cda6fbc673538c54442d9cc6f8cfcf" exitCode=0
Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.261543 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-44fa-account-create-wvt5v" event={"ID":"38640b66-4901-479c-ade1-65fe23e63db6","Type":"ContainerDied","Data":"a27bcfc95ac6fade11d6ff4d4c83a2ab22cda6fbc673538c54442d9cc6f8cfcf"}
Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.261568 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-44fa-account-create-wvt5v" event={"ID":"38640b66-4901-479c-ade1-65fe23e63db6","Type":"ContainerStarted","Data":"82da029bd3c107f27aa81588a72626a2d2f64d53c7377618b3856e70ba92f1b8"}
Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.263398 4959
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8phz8" event={"ID":"fd9b58f4-6d48-4be8-b788-386f6c267440","Type":"ContainerDied","Data":"4832691695d42915854923f1660b4773fb802db12f35f99d1cf8171f8c8eee91"} Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.263653 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4832691695d42915854923f1660b4773fb802db12f35f99d1cf8171f8c8eee91" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.263602 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8phz8" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.265078 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c298-account-create-qkt7r" event={"ID":"90a6fe32-b9be-4678-acc7-9966256aa15d","Type":"ContainerStarted","Data":"92ac18997e0a6b03cc5be8fd36afed853913c50a78cf9835079218a787dc03cc"} Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.266926 4959 generic.go:334] "Generic (PLEG): container finished" podID="d01545ad-9354-4300-b539-c48ec9ff1862" containerID="ae0ef4eef7f7669587b38db33c20ef12611e18ca5de024ecaefb1de294baecf8" exitCode=0 Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.266954 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4df3-account-create-g4qsz" event={"ID":"d01545ad-9354-4300-b539-c48ec9ff1862","Type":"ContainerDied","Data":"ae0ef4eef7f7669587b38db33c20ef12611e18ca5de024ecaefb1de294baecf8"} Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.266972 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4df3-account-create-g4qsz" event={"ID":"d01545ad-9354-4300-b539-c48ec9ff1862","Type":"ContainerStarted","Data":"be1b5e5739081004f04b92c5946a2c89972c2092981097395a68a04fa262c6ed"} Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.532241 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-74b7749bc7-wbq7s"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.532585 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" containerName="dnsmasq-dns" containerID="cri-o://6a58725908e0dcaecbc7cee6280012baa1cf1b597ec257999d10fdf6e8751636" gracePeriod=10 Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.563705 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ncrlk"] Oct 07 13:16:51 crc kubenswrapper[4959]: E1007 13:16:51.564028 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9b58f4-6d48-4be8-b788-386f6c267440" containerName="keystone-db-sync" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.564042 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9b58f4-6d48-4be8-b788-386f6c267440" containerName="keystone-db-sync" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.564203 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9b58f4-6d48-4be8-b788-386f6c267440" containerName="keystone-db-sync" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.564781 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.567069 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6s5kg" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.567162 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.567247 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.567375 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.594824 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ncrlk"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.601248 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bcfd764f-4v2tk"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.604091 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.616978 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bcfd764f-4v2tk"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.666417 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-fernet-keys\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.666495 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm47d\" (UniqueName: \"kubernetes.io/projected/79400832-1960-4b26-95f4-1de395d3b6b7-kube-api-access-fm47d\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.666522 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-credential-keys\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.666580 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-scripts\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.666619 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-config-data\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.666652 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-combined-ca-bundle\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.717882 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d9d976f-mvq6h"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.719227 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.726106 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.726239 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-hjdsx" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.726384 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.726779 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.745211 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d9d976f-mvq6h"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.767811 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.767881 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm47d\" (UniqueName: \"kubernetes.io/projected/79400832-1960-4b26-95f4-1de395d3b6b7-kube-api-access-fm47d\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.767918 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-credential-keys\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.767951 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-dns-svc\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.767992 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5tt\" (UniqueName: \"kubernetes.io/projected/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-kube-api-access-2w5tt\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.768034 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-scripts\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.768061 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.768137 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-config\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.768172 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-config-data\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.768201 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-combined-ca-bundle\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.768225 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-fernet-keys\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.777579 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-fernet-keys\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.777603 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-credential-keys\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.778972 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-scripts\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.780024 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-config-data\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.780676 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-combined-ca-bundle\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " 
pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.795279 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm47d\" (UniqueName: \"kubernetes.io/projected/79400832-1960-4b26-95f4-1de395d3b6b7-kube-api-access-fm47d\") pod \"keystone-bootstrap-ncrlk\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.816226 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.818219 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.819503 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.821829 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.865133 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.906957 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55884ff9b9-x27dr"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.912030 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.928218 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935434 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5tt\" (UniqueName: \"kubernetes.io/projected/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-kube-api-access-2w5tt\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935554 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935634 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-config\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935674 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbb164d-72db-4115-a2b4-e4f2beef4afd-logs\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935751 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czz6q\" (UniqueName: \"kubernetes.io/projected/1cbb164d-72db-4115-a2b4-e4f2beef4afd-kube-api-access-czz6q\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " 
pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935855 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935886 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cbb164d-72db-4115-a2b4-e4f2beef4afd-horizon-secret-key\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935945 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-config-data\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935970 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-dns-svc\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.935997 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-scripts\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:51 crc 
kubenswrapper[4959]: I1007 13:16:51.936877 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.937764 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-config\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.942827 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-dns-svc\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.967541 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.969312 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5tt\" (UniqueName: \"kubernetes.io/projected/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-kube-api-access-2w5tt\") pod \"dnsmasq-dns-67bcfd764f-4v2tk\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.982617 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-67bcfd764f-4v2tk"] Oct 07 13:16:51 crc kubenswrapper[4959]: I1007 13:16:51.983390 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.004086 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55884ff9b9-x27dr"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.012894 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-vcm4z"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.014382 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.018922 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.019171 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2gcrq" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.019279 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.021351 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b99bccc6c-wnttr"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.022808 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.029695 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vcm4z"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039614 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039691 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbb164d-72db-4115-a2b4-e4f2beef4afd-logs\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039725 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czz6q\" (UniqueName: \"kubernetes.io/projected/1cbb164d-72db-4115-a2b4-e4f2beef4afd-kube-api-access-czz6q\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039747 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039799 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cbb164d-72db-4115-a2b4-e4f2beef4afd-horizon-secret-key\") 
pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039820 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-config-data\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039845 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-config-data\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039863 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-scripts\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039881 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-scripts\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039899 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-log-httpd\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 
13:16:52.039932 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmjs\" (UniqueName: \"kubernetes.io/projected/c6955960-fcc4-4d43-9774-fafd72ee3569-kube-api-access-wgmjs\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.039948 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-run-httpd\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.040295 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbb164d-72db-4115-a2b4-e4f2beef4afd-logs\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.041525 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-scripts\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.043140 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b99bccc6c-wnttr"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.044011 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-config-data\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 
13:16:52.063718 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czz6q\" (UniqueName: \"kubernetes.io/projected/1cbb164d-72db-4115-a2b4-e4f2beef4afd-kube-api-access-czz6q\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.068781 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cbb164d-72db-4115-a2b4-e4f2beef4afd-horizon-secret-key\") pod \"horizon-6d9d976f-mvq6h\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") " pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.144673 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-logs\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145035 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmjs\" (UniqueName: \"kubernetes.io/projected/c6955960-fcc4-4d43-9774-fafd72ee3569-kube-api-access-wgmjs\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145065 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lc2l\" (UniqueName: \"kubernetes.io/projected/1504679e-c96c-4491-9c41-fd003beb5296-kube-api-access-5lc2l\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145090 4959 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-run-httpd\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145115 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csvj2\" (UniqueName: \"kubernetes.io/projected/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-kube-api-access-csvj2\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145138 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-scripts\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145166 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-horizon-secret-key\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145189 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145212 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb6sn\" (UniqueName: 
\"kubernetes.io/projected/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-kube-api-access-tb6sn\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145264 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1504679e-c96c-4491-9c41-fd003beb5296-logs\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145293 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-nb\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145325 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.145901 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-run-httpd\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146115 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-config-data\") pod \"horizon-55884ff9b9-x27dr\" 
(UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146198 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-dns-svc\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146224 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-sb\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146260 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-config-data\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146316 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-scripts\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146379 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-scripts\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc 
kubenswrapper[4959]: I1007 13:16:52.146403 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-config-data\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146425 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-config\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146451 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-log-httpd\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.146470 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-combined-ca-bundle\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.147432 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-log-httpd\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.155148 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-scripts\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.155155 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.156070 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.157248 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-config-data\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.160425 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmjs\" (UniqueName: \"kubernetes.io/projected/c6955960-fcc4-4d43-9774-fafd72ee3569-kube-api-access-wgmjs\") pod \"ceilometer-0\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.192234 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.192746 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249529 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lc2l\" (UniqueName: \"kubernetes.io/projected/1504679e-c96c-4491-9c41-fd003beb5296-kube-api-access-5lc2l\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249580 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csvj2\" (UniqueName: \"kubernetes.io/projected/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-kube-api-access-csvj2\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249608 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-scripts\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249654 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-horizon-secret-key\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249687 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb6sn\" (UniqueName: \"kubernetes.io/projected/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-kube-api-access-tb6sn\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " 
pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249742 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1504679e-c96c-4491-9c41-fd003beb5296-logs\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249770 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-nb\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249807 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-config-data\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249853 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-dns-svc\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249876 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-sb\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249899 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-scripts\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249939 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-config-data\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249961 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-config\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.249982 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-combined-ca-bundle\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.250023 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-logs\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.250409 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-logs\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.251268 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-config\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.251618 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-dns-svc\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.252140 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1504679e-c96c-4491-9c41-fd003beb5296-logs\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.252417 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-sb\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.252534 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-config-data\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " 
pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.254878 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-scripts\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.257217 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-scripts\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.258214 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-nb\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.262913 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-horizon-secret-key\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.268866 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-combined-ca-bundle\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.274563 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-csvj2\" (UniqueName: \"kubernetes.io/projected/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-kube-api-access-csvj2\") pod \"dnsmasq-dns-7b99bccc6c-wnttr\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.274773 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-config-data\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.275204 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb6sn\" (UniqueName: \"kubernetes.io/projected/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-kube-api-access-tb6sn\") pod \"horizon-55884ff9b9-x27dr\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") " pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.278817 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lc2l\" (UniqueName: \"kubernetes.io/projected/1504679e-c96c-4491-9c41-fd003beb5296-kube-api-access-5lc2l\") pod \"placement-db-sync-vcm4z\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.284383 4959 generic.go:334] "Generic (PLEG): container finished" podID="90a6fe32-b9be-4678-acc7-9966256aa15d" containerID="f6564aeab92facf5badef9144c1a43b87bb55b0e6c8516f790beab8b34dabda3" exitCode=0 Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.284453 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c298-account-create-qkt7r" 
event={"ID":"90a6fe32-b9be-4678-acc7-9966256aa15d","Type":"ContainerDied","Data":"f6564aeab92facf5badef9144c1a43b87bb55b0e6c8516f790beab8b34dabda3"} Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.294339 4959 generic.go:334] "Generic (PLEG): container finished" podID="38662688-50da-4805-ad0f-e2569f64e5dd" containerID="6a58725908e0dcaecbc7cee6280012baa1cf1b597ec257999d10fdf6e8751636" exitCode=0 Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.294578 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" event={"ID":"38662688-50da-4805-ad0f-e2569f64e5dd","Type":"ContainerDied","Data":"6a58725908e0dcaecbc7cee6280012baa1cf1b597ec257999d10fdf6e8751636"} Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.506208 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vcm4z" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.517132 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.544027 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.718704 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-44fa-account-create-wvt5v" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.724836 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.748660 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ncrlk"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.757339 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bcfd764f-4v2tk"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.853830 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d9d976f-mvq6h"] Oct 07 13:16:52 crc kubenswrapper[4959]: W1007 13:16:52.861331 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cbb164d_72db_4115_a2b4_e4f2beef4afd.slice/crio-6e0f3742625c68f760a9726285f91811b4982b1399f03830c29e9ea4fa8bcbb4 WatchSource:0}: Error finding container 6e0f3742625c68f760a9726285f91811b4982b1399f03830c29e9ea4fa8bcbb4: Status 404 returned error can't find the container with id 6e0f3742625c68f760a9726285f91811b4982b1399f03830c29e9ea4fa8bcbb4 Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.861490 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtl6f\" (UniqueName: \"kubernetes.io/projected/38640b66-4901-479c-ade1-65fe23e63db6-kube-api-access-gtl6f\") pod \"38640b66-4901-479c-ade1-65fe23e63db6\" (UID: \"38640b66-4901-479c-ade1-65fe23e63db6\") " Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.861586 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-nb\") pod \"38662688-50da-4805-ad0f-e2569f64e5dd\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.861648 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-sb\") pod \"38662688-50da-4805-ad0f-e2569f64e5dd\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.861723 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc99d\" (UniqueName: \"kubernetes.io/projected/38662688-50da-4805-ad0f-e2569f64e5dd-kube-api-access-rc99d\") pod \"38662688-50da-4805-ad0f-e2569f64e5dd\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.861756 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-dns-svc\") pod \"38662688-50da-4805-ad0f-e2569f64e5dd\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.861826 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-config\") pod \"38662688-50da-4805-ad0f-e2569f64e5dd\" (UID: \"38662688-50da-4805-ad0f-e2569f64e5dd\") " Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.867911 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.868541 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38662688-50da-4805-ad0f-e2569f64e5dd-kube-api-access-rc99d" (OuterVolumeSpecName: "kube-api-access-rc99d") pod "38662688-50da-4805-ad0f-e2569f64e5dd" (UID: "38662688-50da-4805-ad0f-e2569f64e5dd"). InnerVolumeSpecName "kube-api-access-rc99d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.873809 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38640b66-4901-479c-ade1-65fe23e63db6-kube-api-access-gtl6f" (OuterVolumeSpecName: "kube-api-access-gtl6f") pod "38640b66-4901-479c-ade1-65fe23e63db6" (UID: "38640b66-4901-479c-ade1-65fe23e63db6"). InnerVolumeSpecName "kube-api-access-gtl6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.912580 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4df3-account-create-g4qsz" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.918229 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38662688-50da-4805-ad0f-e2569f64e5dd" (UID: "38662688-50da-4805-ad0f-e2569f64e5dd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.940325 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38662688-50da-4805-ad0f-e2569f64e5dd" (UID: "38662688-50da-4805-ad0f-e2569f64e5dd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.941185 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38662688-50da-4805-ad0f-e2569f64e5dd" (UID: "38662688-50da-4805-ad0f-e2569f64e5dd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.964538 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtl6f\" (UniqueName: \"kubernetes.io/projected/38640b66-4901-479c-ade1-65fe23e63db6-kube-api-access-gtl6f\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.964562 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.964572 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.964582 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc99d\" (UniqueName: \"kubernetes.io/projected/38662688-50da-4805-ad0f-e2569f64e5dd-kube-api-access-rc99d\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.964591 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:52 crc kubenswrapper[4959]: I1007 13:16:52.977217 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-config" (OuterVolumeSpecName: "config") pod "38662688-50da-4805-ad0f-e2569f64e5dd" (UID: "38662688-50da-4805-ad0f-e2569f64e5dd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.066175 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtzpq\" (UniqueName: \"kubernetes.io/projected/d01545ad-9354-4300-b539-c48ec9ff1862-kube-api-access-mtzpq\") pod \"d01545ad-9354-4300-b539-c48ec9ff1862\" (UID: \"d01545ad-9354-4300-b539-c48ec9ff1862\") " Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.066538 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38662688-50da-4805-ad0f-e2569f64e5dd-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.071096 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d01545ad-9354-4300-b539-c48ec9ff1862-kube-api-access-mtzpq" (OuterVolumeSpecName: "kube-api-access-mtzpq") pod "d01545ad-9354-4300-b539-c48ec9ff1862" (UID: "d01545ad-9354-4300-b539-c48ec9ff1862"). InnerVolumeSpecName "kube-api-access-mtzpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:53 crc kubenswrapper[4959]: W1007 13:16:53.113242 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1504679e_c96c_4491_9c41_fd003beb5296.slice/crio-5db9cdd6ab5c653b9d857a41da9ceae7376747451338e7eb8aff0a17f7ace62a WatchSource:0}: Error finding container 5db9cdd6ab5c653b9d857a41da9ceae7376747451338e7eb8aff0a17f7ace62a: Status 404 returned error can't find the container with id 5db9cdd6ab5c653b9d857a41da9ceae7376747451338e7eb8aff0a17f7ace62a Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.114442 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vcm4z"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.129895 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b99bccc6c-wnttr"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.149196 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55884ff9b9-x27dr"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.168647 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtzpq\" (UniqueName: \"kubernetes.io/projected/d01545ad-9354-4300-b539-c48ec9ff1862-kube-api-access-mtzpq\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.303466 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-44fa-account-create-wvt5v" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.303441 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-44fa-account-create-wvt5v" event={"ID":"38640b66-4901-479c-ade1-65fe23e63db6","Type":"ContainerDied","Data":"82da029bd3c107f27aa81588a72626a2d2f64d53c7377618b3856e70ba92f1b8"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.304029 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82da029bd3c107f27aa81588a72626a2d2f64d53c7377618b3856e70ba92f1b8" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.306910 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d9d976f-mvq6h" event={"ID":"1cbb164d-72db-4115-a2b4-e4f2beef4afd","Type":"ContainerStarted","Data":"6e0f3742625c68f760a9726285f91811b4982b1399f03830c29e9ea4fa8bcbb4"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.308066 4959 generic.go:334] "Generic (PLEG): container finished" podID="9f42231e-bd9a-4672-8f6d-36f7297f5b0b" containerID="37f120390209f23c3941ae3f9baeee5e988b38e4349587fb8143d943ef8ff130" exitCode=0 Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.308112 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" event={"ID":"9f42231e-bd9a-4672-8f6d-36f7297f5b0b","Type":"ContainerDied","Data":"37f120390209f23c3941ae3f9baeee5e988b38e4349587fb8143d943ef8ff130"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.308129 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" event={"ID":"9f42231e-bd9a-4672-8f6d-36f7297f5b0b","Type":"ContainerStarted","Data":"fc423866f2f46637b2062e986042d3b95a1178110051e301a819db3ba33daa01"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.310710 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vcm4z" 
event={"ID":"1504679e-c96c-4491-9c41-fd003beb5296","Type":"ContainerStarted","Data":"5db9cdd6ab5c653b9d857a41da9ceae7376747451338e7eb8aff0a17f7ace62a"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.312596 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncrlk" event={"ID":"79400832-1960-4b26-95f4-1de395d3b6b7","Type":"ContainerStarted","Data":"9261500809d59f98a3afbef16a81008c4c95464b1784353631189e0e1e7dda9d"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.312660 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncrlk" event={"ID":"79400832-1960-4b26-95f4-1de395d3b6b7","Type":"ContainerStarted","Data":"18accea2634a8110a89d0e5ee5f62c6583787fce1dad148341e7996442532522"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.314140 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" event={"ID":"38662688-50da-4805-ad0f-e2569f64e5dd","Type":"ContainerDied","Data":"3536b94c7a6a2a3a8db855bb6bfc706bcd98ba57a9c87c4960fb9a59b8482b54"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.314171 4959 scope.go:117] "RemoveContainer" containerID="6a58725908e0dcaecbc7cee6280012baa1cf1b597ec257999d10fdf6e8751636" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.314273 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74b7749bc7-wbq7s" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.319288 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerStarted","Data":"3cb2f75c7a7f0adb7f037263ea991b8c053e1bdfa1d031e374efd4b5e046891a"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.324029 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" event={"ID":"05a2196b-b17f-49ae-a7f2-3ad72d0ff043","Type":"ContainerStarted","Data":"f218edea7cc631260dad9745a991b42dbc37b8f4bc8730ff0acbdc4108037586"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.332538 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55884ff9b9-x27dr" event={"ID":"7c7f07f0-81b0-4f10-9327-2eb3e433ee40","Type":"ContainerStarted","Data":"c55b3dc9c2061fd0a597b93db7104e96e9a65d077cc845b936db2db08d39b14f"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.341215 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4df3-account-create-g4qsz" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.341359 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4df3-account-create-g4qsz" event={"ID":"d01545ad-9354-4300-b539-c48ec9ff1862","Type":"ContainerDied","Data":"be1b5e5739081004f04b92c5946a2c89972c2092981097395a68a04fa262c6ed"} Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.341400 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be1b5e5739081004f04b92c5946a2c89972c2092981097395a68a04fa262c6ed" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.344681 4959 scope.go:117] "RemoveContainer" containerID="fb83c8838737f9c639692a1cfd20662de884dba340df75b2c545bd3924408864" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.365013 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ncrlk" podStartSLOduration=2.364992807 podStartE2EDuration="2.364992807s" podCreationTimestamp="2025-10-07 13:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:16:53.359335622 +0000 UTC m=+965.520058299" watchObservedRunningTime="2025-10-07 13:16:53.364992807 +0000 UTC m=+965.525715484" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.416515 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74b7749bc7-wbq7s"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.427583 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74b7749bc7-wbq7s"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.592899 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55884ff9b9-x27dr"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.632986 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f68678c59-7z9rj"] Oct 
07 13:16:53 crc kubenswrapper[4959]: E1007 13:16:53.633379 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" containerName="dnsmasq-dns" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633392 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" containerName="dnsmasq-dns" Oct 07 13:16:53 crc kubenswrapper[4959]: E1007 13:16:53.633401 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38640b66-4901-479c-ade1-65fe23e63db6" containerName="mariadb-account-create" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633408 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="38640b66-4901-479c-ade1-65fe23e63db6" containerName="mariadb-account-create" Oct 07 13:16:53 crc kubenswrapper[4959]: E1007 13:16:53.633441 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01545ad-9354-4300-b539-c48ec9ff1862" containerName="mariadb-account-create" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633448 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01545ad-9354-4300-b539-c48ec9ff1862" containerName="mariadb-account-create" Oct 07 13:16:53 crc kubenswrapper[4959]: E1007 13:16:53.633458 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" containerName="init" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633463 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" containerName="init" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633616 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01545ad-9354-4300-b539-c48ec9ff1862" containerName="mariadb-account-create" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633645 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="38640b66-4901-479c-ade1-65fe23e63db6" 
containerName="mariadb-account-create" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.633655 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" containerName="dnsmasq-dns" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.634695 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.639663 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f68678c59-7z9rj"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.749417 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.786578 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-scripts\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.786663 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgv7\" (UniqueName: \"kubernetes.io/projected/17e0751a-3aa2-40ec-87c6-d61c6205ff61-kube-api-access-wdgv7\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.786689 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-config-data\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.786747 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0751a-3aa2-40ec-87c6-d61c6205ff61-logs\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.786795 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17e0751a-3aa2-40ec-87c6-d61c6205ff61-horizon-secret-key\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.831965 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c298-account-create-qkt7r" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.891334 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hnsc\" (UniqueName: \"kubernetes.io/projected/90a6fe32-b9be-4678-acc7-9966256aa15d-kube-api-access-5hnsc\") pod \"90a6fe32-b9be-4678-acc7-9966256aa15d\" (UID: \"90a6fe32-b9be-4678-acc7-9966256aa15d\") " Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.891867 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0751a-3aa2-40ec-87c6-d61c6205ff61-logs\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.891944 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17e0751a-3aa2-40ec-87c6-d61c6205ff61-horizon-secret-key\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " 
pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.892000 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-scripts\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.892062 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdgv7\" (UniqueName: \"kubernetes.io/projected/17e0751a-3aa2-40ec-87c6-d61c6205ff61-kube-api-access-wdgv7\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.892103 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-config-data\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.894230 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0751a-3aa2-40ec-87c6-d61c6205ff61-logs\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.895729 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-config-data\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.896347 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-scripts\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.900787 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a6fe32-b9be-4678-acc7-9966256aa15d-kube-api-access-5hnsc" (OuterVolumeSpecName: "kube-api-access-5hnsc") pod "90a6fe32-b9be-4678-acc7-9966256aa15d" (UID: "90a6fe32-b9be-4678-acc7-9966256aa15d"). InnerVolumeSpecName "kube-api-access-5hnsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.907164 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17e0751a-3aa2-40ec-87c6-d61c6205ff61-horizon-secret-key\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.917238 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdgv7\" (UniqueName: \"kubernetes.io/projected/17e0751a-3aa2-40ec-87c6-d61c6205ff61-kube-api-access-wdgv7\") pod \"horizon-f68678c59-7z9rj\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") " pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.963006 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:16:53 crc kubenswrapper[4959]: I1007 13:16:53.993655 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hnsc\" (UniqueName: \"kubernetes.io/projected/90a6fe32-b9be-4678-acc7-9966256aa15d-kube-api-access-5hnsc\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.004156 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.095593 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-sb\") pod \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.095783 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-dns-svc\") pod \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.095815 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-nb\") pod \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.095919 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w5tt\" (UniqueName: \"kubernetes.io/projected/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-kube-api-access-2w5tt\") pod \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 
13:16:54.096006 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-config\") pod \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\" (UID: \"9f42231e-bd9a-4672-8f6d-36f7297f5b0b\") " Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.119043 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f42231e-bd9a-4672-8f6d-36f7297f5b0b" (UID: "9f42231e-bd9a-4672-8f6d-36f7297f5b0b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.119190 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-config" (OuterVolumeSpecName: "config") pod "9f42231e-bd9a-4672-8f6d-36f7297f5b0b" (UID: "9f42231e-bd9a-4672-8f6d-36f7297f5b0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.125870 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-kube-api-access-2w5tt" (OuterVolumeSpecName: "kube-api-access-2w5tt") pod "9f42231e-bd9a-4672-8f6d-36f7297f5b0b" (UID: "9f42231e-bd9a-4672-8f6d-36f7297f5b0b"). InnerVolumeSpecName "kube-api-access-2w5tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.150913 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f42231e-bd9a-4672-8f6d-36f7297f5b0b" (UID: "9f42231e-bd9a-4672-8f6d-36f7297f5b0b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.153748 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f42231e-bd9a-4672-8f6d-36f7297f5b0b" (UID: "9f42231e-bd9a-4672-8f6d-36f7297f5b0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.197931 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.197970 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.197983 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.198001 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w5tt\" (UniqueName: \"kubernetes.io/projected/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-kube-api-access-2w5tt\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.198014 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f42231e-bd9a-4672-8f6d-36f7297f5b0b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.353666 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c298-account-create-qkt7r" 
event={"ID":"90a6fe32-b9be-4678-acc7-9966256aa15d","Type":"ContainerDied","Data":"92ac18997e0a6b03cc5be8fd36afed853913c50a78cf9835079218a787dc03cc"} Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.353712 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c298-account-create-qkt7r" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.353715 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ac18997e0a6b03cc5be8fd36afed853913c50a78cf9835079218a787dc03cc" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.357237 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" event={"ID":"9f42231e-bd9a-4672-8f6d-36f7297f5b0b","Type":"ContainerDied","Data":"fc423866f2f46637b2062e986042d3b95a1178110051e301a819db3ba33daa01"} Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.357308 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bcfd764f-4v2tk" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.357559 4959 scope.go:117] "RemoveContainer" containerID="37f120390209f23c3941ae3f9baeee5e988b38e4349587fb8143d943ef8ff130" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.360476 4959 generic.go:334] "Generic (PLEG): container finished" podID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerID="606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46" exitCode=0 Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.361859 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" event={"ID":"05a2196b-b17f-49ae-a7f2-3ad72d0ff043","Type":"ContainerDied","Data":"606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46"} Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.444795 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bcfd764f-4v2tk"] Oct 07 13:16:54 crc 
kubenswrapper[4959]: I1007 13:16:54.459155 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bcfd764f-4v2tk"] Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.513061 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f68678c59-7z9rj"] Oct 07 13:16:54 crc kubenswrapper[4959]: W1007 13:16:54.514684 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17e0751a_3aa2_40ec_87c6_d61c6205ff61.slice/crio-30ef9eccd85a6a30c811ff70ec8b373db7ede70bb6052b7f13ab76a5adf09db4 WatchSource:0}: Error finding container 30ef9eccd85a6a30c811ff70ec8b373db7ede70bb6052b7f13ab76a5adf09db4: Status 404 returned error can't find the container with id 30ef9eccd85a6a30c811ff70ec8b373db7ede70bb6052b7f13ab76a5adf09db4 Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.827377 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38662688-50da-4805-ad0f-e2569f64e5dd" path="/var/lib/kubelet/pods/38662688-50da-4805-ad0f-e2569f64e5dd/volumes" Oct 07 13:16:54 crc kubenswrapper[4959]: I1007 13:16:54.828148 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f42231e-bd9a-4672-8f6d-36f7297f5b0b" path="/var/lib/kubelet/pods/9f42231e-bd9a-4672-8f6d-36f7297f5b0b/volumes" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.044314 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qkdbw"] Oct 07 13:16:55 crc kubenswrapper[4959]: E1007 13:16:55.044902 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f42231e-bd9a-4672-8f6d-36f7297f5b0b" containerName="init" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.044919 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f42231e-bd9a-4672-8f6d-36f7297f5b0b" containerName="init" Oct 07 13:16:55 crc kubenswrapper[4959]: E1007 13:16:55.044934 4959 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="90a6fe32-b9be-4678-acc7-9966256aa15d" containerName="mariadb-account-create" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.044941 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a6fe32-b9be-4678-acc7-9966256aa15d" containerName="mariadb-account-create" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.045146 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f42231e-bd9a-4672-8f6d-36f7297f5b0b" containerName="init" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.045168 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a6fe32-b9be-4678-acc7-9966256aa15d" containerName="mariadb-account-create" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.045802 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.050937 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rhk7w" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.051152 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.051942 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.060052 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qkdbw"] Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.117055 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-kcfg6"] Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.120204 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125056 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd267601-4074-4cfb-8b40-8cd5fa12917c-etc-machine-id\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125100 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-config-data\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125140 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzw42\" (UniqueName: \"kubernetes.io/projected/bd267601-4074-4cfb-8b40-8cd5fa12917c-kube-api-access-hzw42\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125190 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-db-sync-config-data\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125213 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-scripts\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " 
pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125234 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-combined-ca-bundle\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.125342 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kcfg6"] Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.128161 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gk68v" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.128363 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227339 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-db-sync-config-data\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227392 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-scripts\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227439 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-combined-ca-bundle\") pod \"cinder-db-sync-qkdbw\" (UID: 
\"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227475 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-db-sync-config-data\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227512 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltdrg\" (UniqueName: \"kubernetes.io/projected/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-kube-api-access-ltdrg\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227550 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-combined-ca-bundle\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227790 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd267601-4074-4cfb-8b40-8cd5fa12917c-etc-machine-id\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.227901 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-config-data\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " 
pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.228079 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzw42\" (UniqueName: \"kubernetes.io/projected/bd267601-4074-4cfb-8b40-8cd5fa12917c-kube-api-access-hzw42\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.228303 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd267601-4074-4cfb-8b40-8cd5fa12917c-etc-machine-id\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.232398 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-db-sync-config-data\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.233021 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-scripts\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.244306 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-config-data\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.246425 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-combined-ca-bundle\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.251403 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzw42\" (UniqueName: \"kubernetes.io/projected/bd267601-4074-4cfb-8b40-8cd5fa12917c-kube-api-access-hzw42\") pod \"cinder-db-sync-qkdbw\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.331174 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-db-sync-config-data\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.331233 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltdrg\" (UniqueName: \"kubernetes.io/projected/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-kube-api-access-ltdrg\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.331270 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-combined-ca-bundle\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.336532 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-db-sync-config-data\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.336745 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-combined-ca-bundle\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.349200 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltdrg\" (UniqueName: \"kubernetes.io/projected/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-kube-api-access-ltdrg\") pod \"barbican-db-sync-kcfg6\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.378442 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.379131 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-5j2t5"] Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.392093 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5j2t5"] Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.392202 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.397130 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.397214 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5sv6w" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.397557 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.400222 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f68678c59-7z9rj" event={"ID":"17e0751a-3aa2-40ec-87c6-d61c6205ff61","Type":"ContainerStarted","Data":"30ef9eccd85a6a30c811ff70ec8b373db7ede70bb6052b7f13ab76a5adf09db4"} Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.409429 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" event={"ID":"05a2196b-b17f-49ae-a7f2-3ad72d0ff043","Type":"ContainerStarted","Data":"b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc"} Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.410111 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.448668 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.535295 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-config\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.535703 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-combined-ca-bundle\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.535793 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dvm\" (UniqueName: \"kubernetes.io/projected/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-kube-api-access-c8dvm\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.637852 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8dvm\" (UniqueName: \"kubernetes.io/projected/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-kube-api-access-c8dvm\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.637998 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-config\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 
13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.638027 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-combined-ca-bundle\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.644166 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-combined-ca-bundle\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.653537 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-config\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.654773 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dvm\" (UniqueName: \"kubernetes.io/projected/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-kube-api-access-c8dvm\") pod \"neutron-db-sync-5j2t5\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.752677 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.847496 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" podStartSLOduration=4.847481295 podStartE2EDuration="4.847481295s" podCreationTimestamp="2025-10-07 13:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:16:55.429004217 +0000 UTC m=+967.589726894" watchObservedRunningTime="2025-10-07 13:16:55.847481295 +0000 UTC m=+968.008203972" Oct 07 13:16:55 crc kubenswrapper[4959]: I1007 13:16:55.853018 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qkdbw"] Oct 07 13:16:56 crc kubenswrapper[4959]: I1007 13:16:56.031106 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kcfg6"] Oct 07 13:16:56 crc kubenswrapper[4959]: I1007 13:16:56.425728 4959 generic.go:334] "Generic (PLEG): container finished" podID="79400832-1960-4b26-95f4-1de395d3b6b7" containerID="9261500809d59f98a3afbef16a81008c4c95464b1784353631189e0e1e7dda9d" exitCode=0 Oct 07 13:16:56 crc kubenswrapper[4959]: I1007 13:16:56.425787 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncrlk" event={"ID":"79400832-1960-4b26-95f4-1de395d3b6b7","Type":"ContainerDied","Data":"9261500809d59f98a3afbef16a81008c4c95464b1784353631189e0e1e7dda9d"} Oct 07 13:16:59 crc kubenswrapper[4959]: W1007 13:16:59.845128 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd267601_4074_4cfb_8b40_8cd5fa12917c.slice/crio-225cb415ae5ed76787d54d42ae8464a674eb6917c08840f3fe74f9f30a0f564a WatchSource:0}: Error finding container 225cb415ae5ed76787d54d42ae8464a674eb6917c08840f3fe74f9f30a0f564a: Status 404 returned error can't find the container with id 
225cb415ae5ed76787d54d42ae8464a674eb6917c08840f3fe74f9f30a0f564a Oct 07 13:16:59 crc kubenswrapper[4959]: I1007 13:16:59.927689 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.017340 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-credential-keys\") pod \"79400832-1960-4b26-95f4-1de395d3b6b7\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.017466 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm47d\" (UniqueName: \"kubernetes.io/projected/79400832-1960-4b26-95f4-1de395d3b6b7-kube-api-access-fm47d\") pod \"79400832-1960-4b26-95f4-1de395d3b6b7\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.017503 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-scripts\") pod \"79400832-1960-4b26-95f4-1de395d3b6b7\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.017536 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-combined-ca-bundle\") pod \"79400832-1960-4b26-95f4-1de395d3b6b7\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.017556 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-config-data\") pod \"79400832-1960-4b26-95f4-1de395d3b6b7\" (UID: 
\"79400832-1960-4b26-95f4-1de395d3b6b7\") " Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.017588 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-fernet-keys\") pod \"79400832-1960-4b26-95f4-1de395d3b6b7\" (UID: \"79400832-1960-4b26-95f4-1de395d3b6b7\") " Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.024301 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-scripts" (OuterVolumeSpecName: "scripts") pod "79400832-1960-4b26-95f4-1de395d3b6b7" (UID: "79400832-1960-4b26-95f4-1de395d3b6b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.024864 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "79400832-1960-4b26-95f4-1de395d3b6b7" (UID: "79400832-1960-4b26-95f4-1de395d3b6b7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.024897 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79400832-1960-4b26-95f4-1de395d3b6b7-kube-api-access-fm47d" (OuterVolumeSpecName: "kube-api-access-fm47d") pod "79400832-1960-4b26-95f4-1de395d3b6b7" (UID: "79400832-1960-4b26-95f4-1de395d3b6b7"). InnerVolumeSpecName "kube-api-access-fm47d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.027154 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "79400832-1960-4b26-95f4-1de395d3b6b7" (UID: "79400832-1960-4b26-95f4-1de395d3b6b7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.050106 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79400832-1960-4b26-95f4-1de395d3b6b7" (UID: "79400832-1960-4b26-95f4-1de395d3b6b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.050680 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-config-data" (OuterVolumeSpecName: "config-data") pod "79400832-1960-4b26-95f4-1de395d3b6b7" (UID: "79400832-1960-4b26-95f4-1de395d3b6b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.120090 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.120130 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.120145 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.120158 4959 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.120170 4959 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79400832-1960-4b26-95f4-1de395d3b6b7-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.120181 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm47d\" (UniqueName: \"kubernetes.io/projected/79400832-1960-4b26-95f4-1de395d3b6b7-kube-api-access-fm47d\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.187423 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d9d976f-mvq6h"] Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.223793 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c99fbb6b6-2j7rt"] Oct 07 13:17:00 crc kubenswrapper[4959]: 
E1007 13:17:00.226298 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79400832-1960-4b26-95f4-1de395d3b6b7" containerName="keystone-bootstrap" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.226319 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="79400832-1960-4b26-95f4-1de395d3b6b7" containerName="keystone-bootstrap" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.226578 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="79400832-1960-4b26-95f4-1de395d3b6b7" containerName="keystone-bootstrap" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.230950 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.235484 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.251640 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c99fbb6b6-2j7rt"] Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.323225 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f68678c59-7z9rj"] Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324266 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-secret-key\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324484 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-tls-certs\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " 
pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324523 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468d7f11-8929-410e-a59d-1f78cc33a279-logs\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324749 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-config-data\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324805 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-combined-ca-bundle\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324830 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbbk6\" (UniqueName: \"kubernetes.io/projected/468d7f11-8929-410e-a59d-1f78cc33a279-kube-api-access-lbbk6\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.324852 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-scripts\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " 
pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.352383 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dc68dfcf6-xkrw7"] Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.353763 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.367563 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dc68dfcf6-xkrw7"] Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426749 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-secret-key\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426815 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwz2\" (UniqueName: \"kubernetes.io/projected/41b4db91-ead3-4028-b30c-e3e726ae6f1e-kube-api-access-bjwz2\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426838 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41b4db91-ead3-4028-b30c-e3e726ae6f1e-config-data\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426860 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-combined-ca-bundle\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426890 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-tls-certs\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426911 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468d7f11-8929-410e-a59d-1f78cc33a279-logs\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426949 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-horizon-tls-certs\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.426978 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41b4db91-ead3-4028-b30c-e3e726ae6f1e-scripts\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427009 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-config-data\") pod 
\"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427042 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-combined-ca-bundle\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427061 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbbk6\" (UniqueName: \"kubernetes.io/projected/468d7f11-8929-410e-a59d-1f78cc33a279-kube-api-access-lbbk6\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427076 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b4db91-ead3-4028-b30c-e3e726ae6f1e-logs\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427093 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-scripts\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427122 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-horizon-secret-key\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: 
\"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.427817 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468d7f11-8929-410e-a59d-1f78cc33a279-logs\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.428347 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-scripts\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.428471 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-config-data\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.431516 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-secret-key\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.431765 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-combined-ca-bundle\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.434645 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-tls-certs\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.444328 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbbk6\" (UniqueName: \"kubernetes.io/projected/468d7f11-8929-410e-a59d-1f78cc33a279-kube-api-access-lbbk6\") pod \"horizon-6c99fbb6b6-2j7rt\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") " pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.463603 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ncrlk" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.463595 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncrlk" event={"ID":"79400832-1960-4b26-95f4-1de395d3b6b7","Type":"ContainerDied","Data":"18accea2634a8110a89d0e5ee5f62c6583787fce1dad148341e7996442532522"} Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.463754 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18accea2634a8110a89d0e5ee5f62c6583787fce1dad148341e7996442532522" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.466460 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkdbw" event={"ID":"bd267601-4074-4cfb-8b40-8cd5fa12917c","Type":"ContainerStarted","Data":"225cb415ae5ed76787d54d42ae8464a674eb6917c08840f3fe74f9f30a0f564a"} Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528246 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-combined-ca-bundle\") pod 
\"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528327 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-horizon-tls-certs\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528361 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41b4db91-ead3-4028-b30c-e3e726ae6f1e-scripts\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528465 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b4db91-ead3-4028-b30c-e3e726ae6f1e-logs\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528496 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-horizon-secret-key\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528538 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwz2\" (UniqueName: \"kubernetes.io/projected/41b4db91-ead3-4028-b30c-e3e726ae6f1e-kube-api-access-bjwz2\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " 
pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.528556 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41b4db91-ead3-4028-b30c-e3e726ae6f1e-config-data\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.531832 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b4db91-ead3-4028-b30c-e3e726ae6f1e-logs\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.532396 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-combined-ca-bundle\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.532656 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41b4db91-ead3-4028-b30c-e3e726ae6f1e-scripts\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.532829 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41b4db91-ead3-4028-b30c-e3e726ae6f1e-config-data\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.533155 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-horizon-tls-certs\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.534394 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41b4db91-ead3-4028-b30c-e3e726ae6f1e-horizon-secret-key\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.546158 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwz2\" (UniqueName: \"kubernetes.io/projected/41b4db91-ead3-4028-b30c-e3e726ae6f1e-kube-api-access-bjwz2\") pod \"horizon-dc68dfcf6-xkrw7\" (UID: \"41b4db91-ead3-4028-b30c-e3e726ae6f1e\") " pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.558528 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:00 crc kubenswrapper[4959]: I1007 13:17:00.672477 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.038020 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ncrlk"] Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.045710 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ncrlk"] Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.138016 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-strcc"] Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.141927 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.144803 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.145225 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6s5kg" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.145232 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.145287 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.156859 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-strcc"] Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.239912 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-combined-ca-bundle\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.239971 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-config-data\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.240077 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhjx\" (UniqueName: \"kubernetes.io/projected/aaeb69e1-bac6-448b-832e-d6d32a47547a-kube-api-access-rqhjx\") pod 
\"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.240121 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-scripts\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.240164 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-fernet-keys\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.240182 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-credential-keys\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.343976 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhjx\" (UniqueName: \"kubernetes.io/projected/aaeb69e1-bac6-448b-832e-d6d32a47547a-kube-api-access-rqhjx\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.344343 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-scripts\") pod \"keystone-bootstrap-strcc\" (UID: 
\"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.345377 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-fernet-keys\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.345425 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-credential-keys\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.346523 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-combined-ca-bundle\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.346578 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-config-data\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.356695 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-fernet-keys\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: 
I1007 13:17:01.357016 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-credential-keys\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.359834 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-combined-ca-bundle\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.364261 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-scripts\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.373571 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhjx\" (UniqueName: \"kubernetes.io/projected/aaeb69e1-bac6-448b-832e-d6d32a47547a-kube-api-access-rqhjx\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.373594 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-config-data\") pod \"keystone-bootstrap-strcc\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:01 crc kubenswrapper[4959]: I1007 13:17:01.487955 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:02 crc kubenswrapper[4959]: I1007 13:17:02.519462 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:17:02 crc kubenswrapper[4959]: I1007 13:17:02.607192 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-glb82"] Oct 07 13:17:02 crc kubenswrapper[4959]: I1007 13:17:02.607464 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="dnsmasq-dns" containerID="cri-o://bd66846d6a2fe57e5218379116787a21a12ddf9bbbc1d6cce6e5ee567e72e0af" gracePeriod=10 Oct 07 13:17:02 crc kubenswrapper[4959]: I1007 13:17:02.826440 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79400832-1960-4b26-95f4-1de395d3b6b7" path="/var/lib/kubelet/pods/79400832-1960-4b26-95f4-1de395d3b6b7/volumes" Oct 07 13:17:03 crc kubenswrapper[4959]: I1007 13:17:03.495022 4959 generic.go:334] "Generic (PLEG): container finished" podID="cacf79ac-7a79-42e7-83c7-169654627df8" containerID="bd66846d6a2fe57e5218379116787a21a12ddf9bbbc1d6cce6e5ee567e72e0af" exitCode=0 Oct 07 13:17:03 crc kubenswrapper[4959]: I1007 13:17:03.495064 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" event={"ID":"cacf79ac-7a79-42e7-83c7-169654627df8","Type":"ContainerDied","Data":"bd66846d6a2fe57e5218379116787a21a12ddf9bbbc1d6cce6e5ee567e72e0af"} Oct 07 13:17:06 crc kubenswrapper[4959]: I1007 13:17:06.518585 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kcfg6" event={"ID":"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d","Type":"ContainerStarted","Data":"4fbf597293e122660b6199d5a377de6267de4e633e3812ed5571a611310158b1"} Oct 07 13:17:07 crc kubenswrapper[4959]: E1007 13:17:07.433818 4959 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032" Oct 07 13:17:07 crc kubenswrapper[4959]: E1007 13:17:07.434266 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n544h67dh5fh689h664hbh5dbh8bh5bdh5bh5cchc5h588h564h56ch8ch5fdh584h96h65dh59ch5cch88h57bh687hchbdh59bh74h54dh5ffh546q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgmjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c6955960-fcc4-4d43-9774-fafd72ee3569): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 13:17:07 crc kubenswrapper[4959]: I1007 13:17:07.529860 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" event={"ID":"cacf79ac-7a79-42e7-83c7-169654627df8","Type":"ContainerDied","Data":"9e6852af39620c4e74730d5f54990c2b3b515d56b08795df01b3f34640515fdf"} Oct 07 13:17:07 crc kubenswrapper[4959]: I1007 13:17:07.529909 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e6852af39620c4e74730d5f54990c2b3b515d56b08795df01b3f34640515fdf" Oct 07 13:17:07 crc kubenswrapper[4959]: I1007 13:17:07.563388 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.651669 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-config\") pod \"cacf79ac-7a79-42e7-83c7-169654627df8\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.652111 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msfz\" (UniqueName: \"kubernetes.io/projected/cacf79ac-7a79-42e7-83c7-169654627df8-kube-api-access-5msfz\") pod \"cacf79ac-7a79-42e7-83c7-169654627df8\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.652202 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-sb\") pod \"cacf79ac-7a79-42e7-83c7-169654627df8\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.652238 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-nb\") pod \"cacf79ac-7a79-42e7-83c7-169654627df8\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.652288 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-dns-svc\") pod \"cacf79ac-7a79-42e7-83c7-169654627df8\" (UID: \"cacf79ac-7a79-42e7-83c7-169654627df8\") " Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.657443 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cacf79ac-7a79-42e7-83c7-169654627df8-kube-api-access-5msfz" (OuterVolumeSpecName: "kube-api-access-5msfz") pod "cacf79ac-7a79-42e7-83c7-169654627df8" (UID: "cacf79ac-7a79-42e7-83c7-169654627df8"). InnerVolumeSpecName "kube-api-access-5msfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.696199 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.696260 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.696303 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.696765 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf0d8a96d5046ea44da887dd65609025728fc1479fe4b34e19d62ea3b31f2ff1"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.696819 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" 
containerName="machine-config-daemon" containerID="cri-o://bf0d8a96d5046ea44da887dd65609025728fc1479fe4b34e19d62ea3b31f2ff1" gracePeriod=600 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.704470 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-config" (OuterVolumeSpecName: "config") pod "cacf79ac-7a79-42e7-83c7-169654627df8" (UID: "cacf79ac-7a79-42e7-83c7-169654627df8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.723619 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cacf79ac-7a79-42e7-83c7-169654627df8" (UID: "cacf79ac-7a79-42e7-83c7-169654627df8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.746802 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cacf79ac-7a79-42e7-83c7-169654627df8" (UID: "cacf79ac-7a79-42e7-83c7-169654627df8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.754123 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.754156 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.754170 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msfz\" (UniqueName: \"kubernetes.io/projected/cacf79ac-7a79-42e7-83c7-169654627df8-kube-api-access-5msfz\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.754183 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.758150 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cacf79ac-7a79-42e7-83c7-169654627df8" (UID: "cacf79ac-7a79-42e7-83c7-169654627df8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:07.855796 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cacf79ac-7a79-42e7-83c7-169654627df8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.544239 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vcm4z" event={"ID":"1504679e-c96c-4491-9c41-fd003beb5296","Type":"ContainerStarted","Data":"571142fd1b7a88f5f0f8323aed6243b717f68683c07421e58211dfd7cf77865e"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.547287 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d9d976f-mvq6h" event={"ID":"1cbb164d-72db-4115-a2b4-e4f2beef4afd","Type":"ContainerStarted","Data":"dab9b30874dfb18d1fb10a6a3f95f50922608f45ccc911b646b7f9f25a256a97"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.547333 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d9d976f-mvq6h" event={"ID":"1cbb164d-72db-4115-a2b4-e4f2beef4afd","Type":"ContainerStarted","Data":"bf00157c74bc07a894f3a25bad225aa723be81199730cb3659d8c902e6a5c287"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.547381 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d9d976f-mvq6h" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon-log" containerID="cri-o://bf00157c74bc07a894f3a25bad225aa723be81199730cb3659d8c902e6a5c287" gracePeriod=30 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.547408 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d9d976f-mvq6h" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon" containerID="cri-o://dab9b30874dfb18d1fb10a6a3f95f50922608f45ccc911b646b7f9f25a256a97" gracePeriod=30 Oct 07 13:17:08 crc kubenswrapper[4959]: 
I1007 13:17:08.556048 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="bf0d8a96d5046ea44da887dd65609025728fc1479fe4b34e19d62ea3b31f2ff1" exitCode=0 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.556114 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"bf0d8a96d5046ea44da887dd65609025728fc1479fe4b34e19d62ea3b31f2ff1"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.556148 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"c6df8724a1b950c3c36c1fa6f27f6e7a3f2c184c3d4c9478bac7b1998fa538dc"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.556171 4959 scope.go:117] "RemoveContainer" containerID="ef53d2923cca70810fb795e5b43b9166268df0aa973b6ab2fa0bc61b8de8c8ee" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.561298 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f68678c59-7z9rj" event={"ID":"17e0751a-3aa2-40ec-87c6-d61c6205ff61","Type":"ContainerStarted","Data":"0387d868afea277af48228e13a937a66781088cc91bbbc889aa8139071adb0dc"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.561348 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f68678c59-7z9rj" event={"ID":"17e0751a-3aa2-40ec-87c6-d61c6205ff61","Type":"ContainerStarted","Data":"27a487aeb712f4d6a99f0594e41aa03d699e1f30a738134fea1c143b94f60480"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.561512 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f68678c59-7z9rj" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon-log" 
containerID="cri-o://27a487aeb712f4d6a99f0594e41aa03d699e1f30a738134fea1c143b94f60480" gracePeriod=30 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.561763 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f68678c59-7z9rj" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon" containerID="cri-o://0387d868afea277af48228e13a937a66781088cc91bbbc889aa8139071adb0dc" gracePeriod=30 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.563008 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-vcm4z" podStartSLOduration=3.113786143 podStartE2EDuration="17.562981796s" podCreationTimestamp="2025-10-07 13:16:51 +0000 UTC" firstStartedPulling="2025-10-07 13:16:53.120167601 +0000 UTC m=+965.280890278" lastFinishedPulling="2025-10-07 13:17:07.569363254 +0000 UTC m=+979.730085931" observedRunningTime="2025-10-07 13:17:08.560611907 +0000 UTC m=+980.721334594" watchObservedRunningTime="2025-10-07 13:17:08.562981796 +0000 UTC m=+980.723704473" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.567024 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.567142 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55884ff9b9-x27dr" event={"ID":"7c7f07f0-81b0-4f10-9327-2eb3e433ee40","Type":"ContainerStarted","Data":"0206f4ab7265a3912f96efab6a3043e939436e74954f04ba99ff1edf6d217dcd"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.567195 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55884ff9b9-x27dr" event={"ID":"7c7f07f0-81b0-4f10-9327-2eb3e433ee40","Type":"ContainerStarted","Data":"cf4d8edf09aa8c1718091f0156df1e4a6d3ebd531821fdc63fa104e6de5e15b2"} Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.567356 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55884ff9b9-x27dr" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon-log" containerID="cri-o://cf4d8edf09aa8c1718091f0156df1e4a6d3ebd531821fdc63fa104e6de5e15b2" gracePeriod=30 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.567518 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55884ff9b9-x27dr" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon" containerID="cri-o://0206f4ab7265a3912f96efab6a3043e939436e74954f04ba99ff1edf6d217dcd" gracePeriod=30 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.595639 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d9d976f-mvq6h" podStartSLOduration=2.898457771 podStartE2EDuration="17.595608691s" podCreationTimestamp="2025-10-07 13:16:51 +0000 UTC" firstStartedPulling="2025-10-07 13:16:52.872214244 +0000 UTC m=+965.032936921" lastFinishedPulling="2025-10-07 13:17:07.569365164 +0000 UTC m=+979.730087841" observedRunningTime="2025-10-07 13:17:08.594306323 +0000 UTC m=+980.755029020" watchObservedRunningTime="2025-10-07 13:17:08.595608691 +0000 UTC 
m=+980.756331368" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.667471 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-glb82"] Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.689202 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-glb82"] Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.699103 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55884ff9b9-x27dr" podStartSLOduration=3.209307148 podStartE2EDuration="17.699081139s" podCreationTimestamp="2025-10-07 13:16:51 +0000 UTC" firstStartedPulling="2025-10-07 13:16:53.158354529 +0000 UTC m=+965.319077206" lastFinishedPulling="2025-10-07 13:17:07.64812852 +0000 UTC m=+979.808851197" observedRunningTime="2025-10-07 13:17:08.632974444 +0000 UTC m=+980.793697121" watchObservedRunningTime="2025-10-07 13:17:08.699081139 +0000 UTC m=+980.859803816" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.706029 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f68678c59-7z9rj" podStartSLOduration=2.591296147 podStartE2EDuration="15.705971391s" podCreationTimestamp="2025-10-07 13:16:53 +0000 UTC" firstStartedPulling="2025-10-07 13:16:54.522893487 +0000 UTC m=+966.683616164" lastFinishedPulling="2025-10-07 13:17:07.637568721 +0000 UTC m=+979.798291408" observedRunningTime="2025-10-07 13:17:08.665308641 +0000 UTC m=+980.826031338" watchObservedRunningTime="2025-10-07 13:17:08.705971391 +0000 UTC m=+980.866694068" Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.718102 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5j2t5"] Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.761552 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-strcc"] Oct 07 13:17:08 crc kubenswrapper[4959]: W1007 13:17:08.787716 4959 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468d7f11_8929_410e_a59d_1f78cc33a279.slice/crio-67ff2c5f24ba6b1134361b471278457ba91b1bdff023542e5b362f748119eea9 WatchSource:0}: Error finding container 67ff2c5f24ba6b1134361b471278457ba91b1bdff023542e5b362f748119eea9: Status 404 returned error can't find the container with id 67ff2c5f24ba6b1134361b471278457ba91b1bdff023542e5b362f748119eea9 Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.789663 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c99fbb6b6-2j7rt"] Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.796442 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dc68dfcf6-xkrw7"] Oct 07 13:17:08 crc kubenswrapper[4959]: I1007 13:17:08.840596 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" path="/var/lib/kubelet/pods/cacf79ac-7a79-42e7-83c7-169654627df8/volumes" Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.622820 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5j2t5" event={"ID":"c5722dd5-41d2-40bd-bd65-e57d7567ecf7","Type":"ContainerStarted","Data":"28650878bfd8bce39078769be19fa9e1091264b087e756d0cac0fc335d5b2582"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.623308 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5j2t5" event={"ID":"c5722dd5-41d2-40bd-bd65-e57d7567ecf7","Type":"ContainerStarted","Data":"787ac12f007e24d41521eeb8b0c1440c416155d78003b2d55c3ddd7c97e12fea"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.629356 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc68dfcf6-xkrw7" event={"ID":"41b4db91-ead3-4028-b30c-e3e726ae6f1e","Type":"ContainerStarted","Data":"dd36e2e5be21ae329f310fc0d5a738cbc11dbe211c43f9946fef7afe159931cc"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.630649 
4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c99fbb6b6-2j7rt" event={"ID":"468d7f11-8929-410e-a59d-1f78cc33a279","Type":"ContainerStarted","Data":"9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.630693 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c99fbb6b6-2j7rt" event={"ID":"468d7f11-8929-410e-a59d-1f78cc33a279","Type":"ContainerStarted","Data":"67ff2c5f24ba6b1134361b471278457ba91b1bdff023542e5b362f748119eea9"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.633879 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-strcc" event={"ID":"aaeb69e1-bac6-448b-832e-d6d32a47547a","Type":"ContainerStarted","Data":"a6a116d4990e4e4662b1890b5d66e691092d7c189d2a2eaa6e54657aa34b805e"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.633909 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-strcc" event={"ID":"aaeb69e1-bac6-448b-832e-d6d32a47547a","Type":"ContainerStarted","Data":"d7b1fd80686664260369dbc2f6ba2d64879ee30a9e9eea421d8f6e5b4a7b1e80"} Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.642431 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-5j2t5" podStartSLOduration=14.642414138 podStartE2EDuration="14.642414138s" podCreationTimestamp="2025-10-07 13:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:09.63769485 +0000 UTC m=+981.798417517" watchObservedRunningTime="2025-10-07 13:17:09.642414138 +0000 UTC m=+981.803136815" Oct 07 13:17:09 crc kubenswrapper[4959]: I1007 13:17:09.658294 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-strcc" podStartSLOduration=8.658277232 podStartE2EDuration="8.658277232s" 
podCreationTimestamp="2025-10-07 13:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:09.655979835 +0000 UTC m=+981.816702512" watchObservedRunningTime="2025-10-07 13:17:09.658277232 +0000 UTC m=+981.818999909" Oct 07 13:17:10 crc kubenswrapper[4959]: I1007 13:17:10.651360 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerStarted","Data":"f9b18d1cfd14c4c147769634a5e60537e955aa28e78fb029751696cbae57a53d"} Oct 07 13:17:10 crc kubenswrapper[4959]: I1007 13:17:10.654008 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc68dfcf6-xkrw7" event={"ID":"41b4db91-ead3-4028-b30c-e3e726ae6f1e","Type":"ContainerStarted","Data":"8dd18001471db711755b41a42df3c68a7809a2ff55d7005fb1bf4040d2356b39"} Oct 07 13:17:10 crc kubenswrapper[4959]: I1007 13:17:10.655933 4959 generic.go:334] "Generic (PLEG): container finished" podID="1504679e-c96c-4491-9c41-fd003beb5296" containerID="571142fd1b7a88f5f0f8323aed6243b717f68683c07421e58211dfd7cf77865e" exitCode=0 Oct 07 13:17:10 crc kubenswrapper[4959]: I1007 13:17:10.655991 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vcm4z" event={"ID":"1504679e-c96c-4491-9c41-fd003beb5296","Type":"ContainerDied","Data":"571142fd1b7a88f5f0f8323aed6243b717f68683c07421e58211dfd7cf77865e"} Oct 07 13:17:11 crc kubenswrapper[4959]: I1007 13:17:11.793867 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-dc9d58d7-glb82" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Oct 07 13:17:12 crc kubenswrapper[4959]: I1007 13:17:12.193815 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:17:12 crc 
kubenswrapper[4959]: I1007 13:17:12.544712 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:17:13 crc kubenswrapper[4959]: I1007 13:17:13.963821 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f68678c59-7z9rj" Oct 07 13:17:16 crc kubenswrapper[4959]: I1007 13:17:16.739862 4959 generic.go:334] "Generic (PLEG): container finished" podID="aaeb69e1-bac6-448b-832e-d6d32a47547a" containerID="a6a116d4990e4e4662b1890b5d66e691092d7c189d2a2eaa6e54657aa34b805e" exitCode=0 Oct 07 13:17:16 crc kubenswrapper[4959]: I1007 13:17:16.740326 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-strcc" event={"ID":"aaeb69e1-bac6-448b-832e-d6d32a47547a","Type":"ContainerDied","Data":"a6a116d4990e4e4662b1890b5d66e691092d7c189d2a2eaa6e54657aa34b805e"} Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.021699 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-vcm4z" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.173354 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lc2l\" (UniqueName: \"kubernetes.io/projected/1504679e-c96c-4491-9c41-fd003beb5296-kube-api-access-5lc2l\") pod \"1504679e-c96c-4491-9c41-fd003beb5296\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.173431 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-scripts\") pod \"1504679e-c96c-4491-9c41-fd003beb5296\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.173509 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1504679e-c96c-4491-9c41-fd003beb5296-logs\") pod \"1504679e-c96c-4491-9c41-fd003beb5296\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.173644 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-config-data\") pod \"1504679e-c96c-4491-9c41-fd003beb5296\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.173669 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-combined-ca-bundle\") pod \"1504679e-c96c-4491-9c41-fd003beb5296\" (UID: \"1504679e-c96c-4491-9c41-fd003beb5296\") " Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.174207 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1504679e-c96c-4491-9c41-fd003beb5296-logs" (OuterVolumeSpecName: "logs") pod "1504679e-c96c-4491-9c41-fd003beb5296" (UID: "1504679e-c96c-4491-9c41-fd003beb5296"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.179096 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-scripts" (OuterVolumeSpecName: "scripts") pod "1504679e-c96c-4491-9c41-fd003beb5296" (UID: "1504679e-c96c-4491-9c41-fd003beb5296"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.197304 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1504679e-c96c-4491-9c41-fd003beb5296-kube-api-access-5lc2l" (OuterVolumeSpecName: "kube-api-access-5lc2l") pod "1504679e-c96c-4491-9c41-fd003beb5296" (UID: "1504679e-c96c-4491-9c41-fd003beb5296"). InnerVolumeSpecName "kube-api-access-5lc2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.201551 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-config-data" (OuterVolumeSpecName: "config-data") pod "1504679e-c96c-4491-9c41-fd003beb5296" (UID: "1504679e-c96c-4491-9c41-fd003beb5296"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.204839 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1504679e-c96c-4491-9c41-fd003beb5296" (UID: "1504679e-c96c-4491-9c41-fd003beb5296"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.276131 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.276183 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.276196 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lc2l\" (UniqueName: \"kubernetes.io/projected/1504679e-c96c-4491-9c41-fd003beb5296-kube-api-access-5lc2l\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.276208 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1504679e-c96c-4491-9c41-fd003beb5296-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.276241 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1504679e-c96c-4491-9c41-fd003beb5296-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.770237 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vcm4z" event={"ID":"1504679e-c96c-4491-9c41-fd003beb5296","Type":"ContainerDied","Data":"5db9cdd6ab5c653b9d857a41da9ceae7376747451338e7eb8aff0a17f7ace62a"} Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.770491 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db9cdd6ab5c653b9d857a41da9ceae7376747451338e7eb8aff0a17f7ace62a" Oct 07 13:17:19 crc kubenswrapper[4959]: I1007 13:17:19.770364 4959 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-vcm4z" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.238125 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58674f758b-wncml"] Oct 07 13:17:20 crc kubenswrapper[4959]: E1007 13:17:20.238528 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="dnsmasq-dns" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.238540 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="dnsmasq-dns" Oct 07 13:17:20 crc kubenswrapper[4959]: E1007 13:17:20.238559 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504679e-c96c-4491-9c41-fd003beb5296" containerName="placement-db-sync" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.238566 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504679e-c96c-4491-9c41-fd003beb5296" containerName="placement-db-sync" Oct 07 13:17:20 crc kubenswrapper[4959]: E1007 13:17:20.238604 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="init" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.238614 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="init" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.238861 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504679e-c96c-4491-9c41-fd003beb5296" containerName="placement-db-sync" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.238882 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacf79ac-7a79-42e7-83c7-169654627df8" containerName="dnsmasq-dns" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.241981 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.289723 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.289931 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.289952 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.290034 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2gcrq" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.290110 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.298086 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58674f758b-wncml"] Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394686 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwc8v\" (UniqueName: \"kubernetes.io/projected/ef3ca2a1-1eed-47fc-8454-47decce134d5-kube-api-access-jwc8v\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394766 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-config-data\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394838 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-scripts\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394891 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-internal-tls-certs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394935 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-public-tls-certs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394957 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-combined-ca-bundle\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.394999 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3ca2a1-1eed-47fc-8454-47decce134d5-logs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.496427 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3ca2a1-1eed-47fc-8454-47decce134d5-logs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.496730 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwc8v\" (UniqueName: \"kubernetes.io/projected/ef3ca2a1-1eed-47fc-8454-47decce134d5-kube-api-access-jwc8v\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.496857 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-config-data\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.496944 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-scripts\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.497041 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-internal-tls-certs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.497132 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-public-tls-certs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.496882 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3ca2a1-1eed-47fc-8454-47decce134d5-logs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.497209 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-combined-ca-bundle\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.503277 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-scripts\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.513228 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-public-tls-certs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.513793 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-combined-ca-bundle\") pod \"placement-58674f758b-wncml\" (UID: 
\"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.514103 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-internal-tls-certs\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.514455 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3ca2a1-1eed-47fc-8454-47decce134d5-config-data\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.533177 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwc8v\" (UniqueName: \"kubernetes.io/projected/ef3ca2a1-1eed-47fc-8454-47decce134d5-kube-api-access-jwc8v\") pod \"placement-58674f758b-wncml\" (UID: \"ef3ca2a1-1eed-47fc-8454-47decce134d5\") " pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:20 crc kubenswrapper[4959]: I1007 13:17:20.623639 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:28 crc kubenswrapper[4959]: E1007 13:17:28.543661 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 07 13:17:28 crc kubenswrapper[4959]: E1007 13:17:28.544359 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMo
unt{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzw42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qkdbw_openstack(bd267601-4074-4cfb-8b40-8cd5fa12917c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 13:17:28 crc kubenswrapper[4959]: E1007 13:17:28.547264 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qkdbw" podUID="bd267601-4074-4cfb-8b40-8cd5fa12917c" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.734612 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.811218 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-scripts\") pod \"aaeb69e1-bac6-448b-832e-d6d32a47547a\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.811543 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-config-data\") pod \"aaeb69e1-bac6-448b-832e-d6d32a47547a\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.811617 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-fernet-keys\") pod \"aaeb69e1-bac6-448b-832e-d6d32a47547a\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.811656 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-credential-keys\") pod \"aaeb69e1-bac6-448b-832e-d6d32a47547a\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.811706 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhjx\" (UniqueName: \"kubernetes.io/projected/aaeb69e1-bac6-448b-832e-d6d32a47547a-kube-api-access-rqhjx\") pod \"aaeb69e1-bac6-448b-832e-d6d32a47547a\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.811759 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-combined-ca-bundle\") pod \"aaeb69e1-bac6-448b-832e-d6d32a47547a\" (UID: \"aaeb69e1-bac6-448b-832e-d6d32a47547a\") " Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.829421 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "aaeb69e1-bac6-448b-832e-d6d32a47547a" (UID: "aaeb69e1-bac6-448b-832e-d6d32a47547a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.830102 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaeb69e1-bac6-448b-832e-d6d32a47547a-kube-api-access-rqhjx" (OuterVolumeSpecName: "kube-api-access-rqhjx") pod "aaeb69e1-bac6-448b-832e-d6d32a47547a" (UID: "aaeb69e1-bac6-448b-832e-d6d32a47547a"). InnerVolumeSpecName "kube-api-access-rqhjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.830277 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-scripts" (OuterVolumeSpecName: "scripts") pod "aaeb69e1-bac6-448b-832e-d6d32a47547a" (UID: "aaeb69e1-bac6-448b-832e-d6d32a47547a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.831084 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "aaeb69e1-bac6-448b-832e-d6d32a47547a" (UID: "aaeb69e1-bac6-448b-832e-d6d32a47547a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.846715 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-strcc" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.846830 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-strcc" event={"ID":"aaeb69e1-bac6-448b-832e-d6d32a47547a","Type":"ContainerDied","Data":"d7b1fd80686664260369dbc2f6ba2d64879ee30a9e9eea421d8f6e5b4a7b1e80"} Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.846872 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b1fd80686664260369dbc2f6ba2d64879ee30a9e9eea421d8f6e5b4a7b1e80" Oct 07 13:17:28 crc kubenswrapper[4959]: E1007 13:17:28.851079 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-qkdbw" podUID="bd267601-4074-4cfb-8b40-8cd5fa12917c" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.909040 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaeb69e1-bac6-448b-832e-d6d32a47547a" (UID: "aaeb69e1-bac6-448b-832e-d6d32a47547a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.914212 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqhjx\" (UniqueName: \"kubernetes.io/projected/aaeb69e1-bac6-448b-832e-d6d32a47547a-kube-api-access-rqhjx\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.914250 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.914262 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.914273 4959 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.914284 4959 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:28 crc kubenswrapper[4959]: I1007 13:17:28.920392 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-config-data" (OuterVolumeSpecName: "config-data") pod "aaeb69e1-bac6-448b-832e-d6d32a47547a" (UID: "aaeb69e1-bac6-448b-832e-d6d32a47547a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.007372 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58674f758b-wncml"] Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.015601 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaeb69e1-bac6-448b-832e-d6d32a47547a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.813538 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-54f9969c74-l8zmx"] Oct 07 13:17:29 crc kubenswrapper[4959]: E1007 13:17:29.814133 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaeb69e1-bac6-448b-832e-d6d32a47547a" containerName="keystone-bootstrap" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.814144 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaeb69e1-bac6-448b-832e-d6d32a47547a" containerName="keystone-bootstrap" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.814317 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaeb69e1-bac6-448b-832e-d6d32a47547a" containerName="keystone-bootstrap" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.814843 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.821457 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.821650 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6s5kg" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.821704 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.821735 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.821817 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827396 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-combined-ca-bundle\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827530 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-internal-tls-certs\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827612 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-fernet-keys\") pod 
\"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827694 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-public-tls-certs\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827723 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzftg\" (UniqueName: \"kubernetes.io/projected/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-kube-api-access-xzftg\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827796 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-config-data\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827862 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-scripts\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.827909 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-credential-keys\") pod 
\"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.828475 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.830026 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54f9969c74-l8zmx"] Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.859673 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kcfg6" event={"ID":"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d","Type":"ContainerStarted","Data":"708a9382a473495a0e7a1f5893576bfb745e62b1c85d24f08e67f2a93cbbbdae"} Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.866377 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc68dfcf6-xkrw7" event={"ID":"41b4db91-ead3-4028-b30c-e3e726ae6f1e","Type":"ContainerStarted","Data":"0b2f2b5ee788acb79c0a00444a00a6f9f1efc83c8a0787c8c51d3d588746c4b3"} Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.876447 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-kcfg6" podStartSLOduration=12.566726607 podStartE2EDuration="34.87642503s" podCreationTimestamp="2025-10-07 13:16:55 +0000 UTC" firstStartedPulling="2025-10-07 13:17:06.212318416 +0000 UTC m=+978.373041083" lastFinishedPulling="2025-10-07 13:17:28.522016829 +0000 UTC m=+1000.682739506" observedRunningTime="2025-10-07 13:17:29.874875875 +0000 UTC m=+1002.035598542" watchObservedRunningTime="2025-10-07 13:17:29.87642503 +0000 UTC m=+1002.037147707" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.880855 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c99fbb6b6-2j7rt" event={"ID":"468d7f11-8929-410e-a59d-1f78cc33a279","Type":"ContainerStarted","Data":"52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617"} 
Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.894100 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dc68dfcf6-xkrw7" podStartSLOduration=29.894081027 podStartE2EDuration="29.894081027s" podCreationTimestamp="2025-10-07 13:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:29.893580603 +0000 UTC m=+1002.054303290" watchObservedRunningTime="2025-10-07 13:17:29.894081027 +0000 UTC m=+1002.054803704" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930112 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-config-data\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930155 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-scripts\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930204 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-credential-keys\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930294 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-combined-ca-bundle\") pod \"keystone-54f9969c74-l8zmx\" (UID: 
\"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930352 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-internal-tls-certs\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930391 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-fernet-keys\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930427 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-public-tls-certs\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.930443 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzftg\" (UniqueName: \"kubernetes.io/projected/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-kube-api-access-xzftg\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.936207 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-combined-ca-bundle\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " 
pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.936218 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-credential-keys\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.937945 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-scripts\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.938248 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-internal-tls-certs\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.938278 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-config-data\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.938560 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-fernet-keys\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.944309 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-public-tls-certs\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:29 crc kubenswrapper[4959]: I1007 13:17:29.946902 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzftg\" (UniqueName: \"kubernetes.io/projected/969d49d0-51dc-47c4-a4fb-aba1b09f4a6a-kube-api-access-xzftg\") pod \"keystone-54f9969c74-l8zmx\" (UID: \"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a\") " pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:30 crc kubenswrapper[4959]: I1007 13:17:30.144512 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:30 crc kubenswrapper[4959]: I1007 13:17:30.559323 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:30 crc kubenswrapper[4959]: I1007 13:17:30.559684 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:30 crc kubenswrapper[4959]: I1007 13:17:30.673610 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:30 crc kubenswrapper[4959]: I1007 13:17:30.673680 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:31 crc kubenswrapper[4959]: W1007 13:17:31.909296 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef3ca2a1_1eed_47fc_8454_47decce134d5.slice/crio-4cc7bfa011bd71c67beb1dd72cb0356c06f6d0483c759090f6e5466d13a9e9af WatchSource:0}: Error finding container 4cc7bfa011bd71c67beb1dd72cb0356c06f6d0483c759090f6e5466d13a9e9af: Status 404 returned error 
can't find the container with id 4cc7bfa011bd71c67beb1dd72cb0356c06f6d0483c759090f6e5466d13a9e9af Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.365291 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c99fbb6b6-2j7rt" podStartSLOduration=32.365272713 podStartE2EDuration="32.365272713s" podCreationTimestamp="2025-10-07 13:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:29.921078837 +0000 UTC m=+1002.081801524" watchObservedRunningTime="2025-10-07 13:17:32.365272713 +0000 UTC m=+1004.525995390" Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.370614 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54f9969c74-l8zmx"] Oct 07 13:17:32 crc kubenswrapper[4959]: W1007 13:17:32.394378 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod969d49d0_51dc_47c4_a4fb_aba1b09f4a6a.slice/crio-405bf2fed40076fbb1aa027b0d59f3e941870cc6e1b95aff8d7a96b83b919282 WatchSource:0}: Error finding container 405bf2fed40076fbb1aa027b0d59f3e941870cc6e1b95aff8d7a96b83b919282: Status 404 returned error can't find the container with id 405bf2fed40076fbb1aa027b0d59f3e941870cc6e1b95aff8d7a96b83b919282 Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.910165 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54f9969c74-l8zmx" event={"ID":"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a","Type":"ContainerStarted","Data":"7e7ee2a08f33c6941eb33c7ad8ee32d158b3249f08c44852706fef1168398507"} Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.910203 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54f9969c74-l8zmx" event={"ID":"969d49d0-51dc-47c4-a4fb-aba1b09f4a6a","Type":"ContainerStarted","Data":"405bf2fed40076fbb1aa027b0d59f3e941870cc6e1b95aff8d7a96b83b919282"} Oct 07 
13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.910233 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-54f9969c74-l8zmx" Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.913471 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58674f758b-wncml" event={"ID":"ef3ca2a1-1eed-47fc-8454-47decce134d5","Type":"ContainerStarted","Data":"d9374b7d3fba9f4da2ec2ad21960a9f1cd6e13513d6378e2de0613605fc2c059"} Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.913530 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58674f758b-wncml" event={"ID":"ef3ca2a1-1eed-47fc-8454-47decce134d5","Type":"ContainerStarted","Data":"62e8e95c50d5c9a0cc607d3adaf3bfca712ff13fe9bd42e8abd51b80a4ce3ccd"} Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.913573 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.913586 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58674f758b-wncml" event={"ID":"ef3ca2a1-1eed-47fc-8454-47decce134d5","Type":"ContainerStarted","Data":"4cc7bfa011bd71c67beb1dd72cb0356c06f6d0483c759090f6e5466d13a9e9af"} Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.913612 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.914987 4959 generic.go:334] "Generic (PLEG): container finished" podID="8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" containerID="708a9382a473495a0e7a1f5893576bfb745e62b1c85d24f08e67f2a93cbbbdae" exitCode=0 Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.915042 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kcfg6" 
event={"ID":"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d","Type":"ContainerDied","Data":"708a9382a473495a0e7a1f5893576bfb745e62b1c85d24f08e67f2a93cbbbdae"} Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.917475 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerStarted","Data":"fd730955a7c41260cd0b6037668b15ee3386a0b448a83cd1ed585ec0b0724e99"} Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.933040 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-54f9969c74-l8zmx" podStartSLOduration=3.93302194 podStartE2EDuration="3.93302194s" podCreationTimestamp="2025-10-07 13:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:32.927841199 +0000 UTC m=+1005.088563946" watchObservedRunningTime="2025-10-07 13:17:32.93302194 +0000 UTC m=+1005.093744617" Oct 07 13:17:32 crc kubenswrapper[4959]: I1007 13:17:32.975484 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58674f758b-wncml" podStartSLOduration=12.975465422 podStartE2EDuration="12.975465422s" podCreationTimestamp="2025-10-07 13:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:32.970696113 +0000 UTC m=+1005.131418800" watchObservedRunningTime="2025-10-07 13:17:32.975465422 +0000 UTC m=+1005.136188099" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.336429 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.415386 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltdrg\" (UniqueName: \"kubernetes.io/projected/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-kube-api-access-ltdrg\") pod \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.415432 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-combined-ca-bundle\") pod \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.415464 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-db-sync-config-data\") pod \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\" (UID: \"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d\") " Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.421998 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" (UID: "8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.422563 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-kube-api-access-ltdrg" (OuterVolumeSpecName: "kube-api-access-ltdrg") pod "8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" (UID: "8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d"). 
InnerVolumeSpecName "kube-api-access-ltdrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.447090 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" (UID: "8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.517094 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltdrg\" (UniqueName: \"kubernetes.io/projected/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-kube-api-access-ltdrg\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.517139 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.517152 4959 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.946375 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kcfg6" event={"ID":"8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d","Type":"ContainerDied","Data":"4fbf597293e122660b6199d5a377de6267de4e633e3812ed5571a611310158b1"} Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.946454 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fbf597293e122660b6199d5a377de6267de4e633e3812ed5571a611310158b1" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.946410 4959 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kcfg6" Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.950681 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5j2t5" event={"ID":"c5722dd5-41d2-40bd-bd65-e57d7567ecf7","Type":"ContainerDied","Data":"28650878bfd8bce39078769be19fa9e1091264b087e756d0cac0fc335d5b2582"} Oct 07 13:17:34 crc kubenswrapper[4959]: I1007 13:17:34.950608 4959 generic.go:334] "Generic (PLEG): container finished" podID="c5722dd5-41d2-40bd-bd65-e57d7567ecf7" containerID="28650878bfd8bce39078769be19fa9e1091264b087e756d0cac0fc335d5b2582" exitCode=0 Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.187326 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-878d55485-gnqkk"] Oct 07 13:17:35 crc kubenswrapper[4959]: E1007 13:17:35.187950 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" containerName="barbican-db-sync" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.187975 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" containerName="barbican-db-sync" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.188232 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" containerName="barbican-db-sync" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.189346 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.195574 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.195819 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gk68v" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.196188 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.217999 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-878d55485-gnqkk"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.307708 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65d5b6b857-sjcx7"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.309397 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.315612 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-57fd9f6674-4cfc2"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.316960 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.320316 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.327911 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68235903-6ab3-44c7-90a1-c49f473e4568-logs\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.327954 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-config-data\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.327991 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqcsf\" (UniqueName: \"kubernetes.io/projected/68235903-6ab3-44c7-90a1-c49f473e4568-kube-api-access-fqcsf\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.328029 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-config-data-custom\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.328046 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-combined-ca-bundle\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.360717 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-57fd9f6674-4cfc2"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.408023 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65d5b6b857-sjcx7"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429121 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcsf\" (UniqueName: \"kubernetes.io/projected/68235903-6ab3-44c7-90a1-c49f473e4568-kube-api-access-fqcsf\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429189 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-config-data-custom\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429223 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-combined-ca-bundle\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429242 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-config-data\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429262 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z572g\" (UniqueName: \"kubernetes.io/projected/bca29064-c4e9-4bc7-80cb-d91ec2419edd-kube-api-access-z572g\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429287 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-nb\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429302 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-config-data-custom\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429324 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkgh\" (UniqueName: \"kubernetes.io/projected/97567312-2948-4f23-a1e5-da00d2689376-kube-api-access-6tkgh\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " 
pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429348 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97567312-2948-4f23-a1e5-da00d2689376-logs\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429385 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-dns-svc\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429413 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-sb\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429443 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-combined-ca-bundle\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429482 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-config\") pod 
\"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429508 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68235903-6ab3-44c7-90a1-c49f473e4568-logs\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.429525 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-config-data\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.432154 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68235903-6ab3-44c7-90a1-c49f473e4568-logs\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.456276 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-config-data-custom\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.457691 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-config-data\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " 
pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.466066 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcsf\" (UniqueName: \"kubernetes.io/projected/68235903-6ab3-44c7-90a1-c49f473e4568-kube-api-access-fqcsf\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.481312 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68235903-6ab3-44c7-90a1-c49f473e4568-combined-ca-bundle\") pod \"barbican-worker-878d55485-gnqkk\" (UID: \"68235903-6ab3-44c7-90a1-c49f473e4568\") " pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531324 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-config-data\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531642 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z572g\" (UniqueName: \"kubernetes.io/projected/bca29064-c4e9-4bc7-80cb-d91ec2419edd-kube-api-access-z572g\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531670 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-nb\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " 
pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531687 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-config-data-custom\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531709 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkgh\" (UniqueName: \"kubernetes.io/projected/97567312-2948-4f23-a1e5-da00d2689376-kube-api-access-6tkgh\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531732 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97567312-2948-4f23-a1e5-da00d2689376-logs\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531762 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-dns-svc\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531790 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-sb\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: 
\"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531819 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-combined-ca-bundle\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.531862 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-config\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.532000 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-878d55485-gnqkk" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.535071 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97567312-2948-4f23-a1e5-da00d2689376-logs\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.536287 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-nb\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.537226 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-dns-svc\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.537336 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-config-data\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.538287 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-sb\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.541190 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-config\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.541715 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-config-data-custom\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.543411 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97567312-2948-4f23-a1e5-da00d2689376-combined-ca-bundle\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.574143 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z572g\" (UniqueName: \"kubernetes.io/projected/bca29064-c4e9-4bc7-80cb-d91ec2419edd-kube-api-access-z572g\") pod \"dnsmasq-dns-65d5b6b857-sjcx7\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.575905 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkgh\" (UniqueName: \"kubernetes.io/projected/97567312-2948-4f23-a1e5-da00d2689376-kube-api-access-6tkgh\") pod \"barbican-keystone-listener-57fd9f6674-4cfc2\" (UID: \"97567312-2948-4f23-a1e5-da00d2689376\") " pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.600706 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8465f58844-ng7np"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.612372 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.617601 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.623638 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8465f58844-ng7np"] Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.630077 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.643855 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.741553 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.741930 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data-custom\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.742008 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-combined-ca-bundle\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.742024 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e95dc55-c289-4042-80c7-bc5253f80e0f-logs\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.742053 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dpg5\" (UniqueName: \"kubernetes.io/projected/4e95dc55-c289-4042-80c7-bc5253f80e0f-kube-api-access-9dpg5\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.843667 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.843742 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data-custom\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.843839 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-combined-ca-bundle\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.843869 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e95dc55-c289-4042-80c7-bc5253f80e0f-logs\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.843914 4959 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9dpg5\" (UniqueName: \"kubernetes.io/projected/4e95dc55-c289-4042-80c7-bc5253f80e0f-kube-api-access-9dpg5\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.845916 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e95dc55-c289-4042-80c7-bc5253f80e0f-logs\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.848395 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data-custom\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.852063 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-combined-ca-bundle\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.861361 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.864454 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dpg5\" (UniqueName: 
\"kubernetes.io/projected/4e95dc55-c289-4042-80c7-bc5253f80e0f-kube-api-access-9dpg5\") pod \"barbican-api-8465f58844-ng7np\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") " pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:35 crc kubenswrapper[4959]: I1007 13:17:35.967267 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.025818 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-878d55485-gnqkk"] Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.162361 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-57fd9f6674-4cfc2"] Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.208209 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65d5b6b857-sjcx7"] Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.218542 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:17:36 crc kubenswrapper[4959]: W1007 13:17:36.222432 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca29064_c4e9_4bc7_80cb_d91ec2419edd.slice/crio-2f7f1884e8eaa357de1baa77aa70d80d1e23805326481ecd0e18dff34806f15a WatchSource:0}: Error finding container 2f7f1884e8eaa357de1baa77aa70d80d1e23805326481ecd0e18dff34806f15a: Status 404 returned error can't find the container with id 2f7f1884e8eaa357de1baa77aa70d80d1e23805326481ecd0e18dff34806f15a Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.252757 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8dvm\" (UniqueName: \"kubernetes.io/projected/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-kube-api-access-c8dvm\") pod \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.253108 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-combined-ca-bundle\") pod \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.253239 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-config\") pod \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\" (UID: \"c5722dd5-41d2-40bd-bd65-e57d7567ecf7\") " Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.258973 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-kube-api-access-c8dvm" (OuterVolumeSpecName: "kube-api-access-c8dvm") pod "c5722dd5-41d2-40bd-bd65-e57d7567ecf7" (UID: 
"c5722dd5-41d2-40bd-bd65-e57d7567ecf7"). InnerVolumeSpecName "kube-api-access-c8dvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.295706 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5722dd5-41d2-40bd-bd65-e57d7567ecf7" (UID: "c5722dd5-41d2-40bd-bd65-e57d7567ecf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.305929 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-config" (OuterVolumeSpecName: "config") pod "c5722dd5-41d2-40bd-bd65-e57d7567ecf7" (UID: "c5722dd5-41d2-40bd-bd65-e57d7567ecf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.358057 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8dvm\" (UniqueName: \"kubernetes.io/projected/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-kube-api-access-c8dvm\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.358097 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.358110 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5722dd5-41d2-40bd-bd65-e57d7567ecf7-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.500873 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8465f58844-ng7np"] Oct 07 13:17:36 crc 
kubenswrapper[4959]: I1007 13:17:36.980644 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-878d55485-gnqkk" event={"ID":"68235903-6ab3-44c7-90a1-c49f473e4568","Type":"ContainerStarted","Data":"282b36cca3ad809d47c41b9e1cdba228522cbc0a990a71f571f28f4ad07a38a1"} Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.983754 4959 generic.go:334] "Generic (PLEG): container finished" podID="bca29064-c4e9-4bc7-80cb-d91ec2419edd" containerID="5bb503929b35b7a6af8b786ab06b9998d93a23df085acf2d936dfd724e0b7648" exitCode=0 Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.983800 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" event={"ID":"bca29064-c4e9-4bc7-80cb-d91ec2419edd","Type":"ContainerDied","Data":"5bb503929b35b7a6af8b786ab06b9998d93a23df085acf2d936dfd724e0b7648"} Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.983817 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" event={"ID":"bca29064-c4e9-4bc7-80cb-d91ec2419edd","Type":"ContainerStarted","Data":"2f7f1884e8eaa357de1baa77aa70d80d1e23805326481ecd0e18dff34806f15a"} Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.993554 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8465f58844-ng7np" event={"ID":"4e95dc55-c289-4042-80c7-bc5253f80e0f","Type":"ContainerStarted","Data":"db5dc858cf6080db9c9f095574477211be79a472d22e084f2acc0203db2fdafc"} Oct 07 13:17:36 crc kubenswrapper[4959]: I1007 13:17:36.993703 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8465f58844-ng7np" event={"ID":"4e95dc55-c289-4042-80c7-bc5253f80e0f","Type":"ContainerStarted","Data":"55d665fb7fb4ef981cf695f9a95b3311468fa68ca993f5df51cd2940df9704c3"} Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.002506 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" 
event={"ID":"97567312-2948-4f23-a1e5-da00d2689376","Type":"ContainerStarted","Data":"5ff321485fe6f7071ce1d62765232f57545bc1d89ffd5dbcf793928456989e69"} Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.004884 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5j2t5" event={"ID":"c5722dd5-41d2-40bd-bd65-e57d7567ecf7","Type":"ContainerDied","Data":"787ac12f007e24d41521eeb8b0c1440c416155d78003b2d55c3ddd7c97e12fea"} Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.005016 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="787ac12f007e24d41521eeb8b0c1440c416155d78003b2d55c3ddd7c97e12fea" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.005207 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5j2t5" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.134907 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d5b6b857-sjcx7"] Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.187924 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"] Oct 07 13:17:37 crc kubenswrapper[4959]: E1007 13:17:37.188262 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5722dd5-41d2-40bd-bd65-e57d7567ecf7" containerName="neutron-db-sync" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.188273 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5722dd5-41d2-40bd-bd65-e57d7567ecf7" containerName="neutron-db-sync" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.188434 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5722dd5-41d2-40bd-bd65-e57d7567ecf7" containerName="neutron-db-sync" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.191317 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.222687 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"] Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.233423 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-94d9575bd-dbgt6"] Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.236482 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.248471 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.248741 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5sv6w" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.248891 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.249111 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.291793 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-94d9575bd-dbgt6"] Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.374786 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznvf\" (UniqueName: \"kubernetes.io/projected/b291ed18-9a20-42a8-9cf7-64a9ef0251db-kube-api-access-gznvf\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375084 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9h2xg\" (UniqueName: \"kubernetes.io/projected/e4e49e10-6ce8-4bde-b15c-141fe2479574-kube-api-access-9h2xg\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375125 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-config\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375170 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375206 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-httpd-config\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375225 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-config\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375290 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-combined-ca-bundle\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375334 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375371 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-ovndb-tls-certs\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.375395 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-dns-svc\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: E1007 13:17:37.446456 4959 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 07 13:17:37 crc kubenswrapper[4959]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/bca29064-c4e9-4bc7-80cb-d91ec2419edd/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 13:17:37 crc kubenswrapper[4959]: > podSandboxID="2f7f1884e8eaa357de1baa77aa70d80d1e23805326481ecd0e18dff34806f15a" Oct 07 13:17:37 crc 
kubenswrapper[4959]: E1007 13:17:37.448281 4959 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 07 13:17:37 crc kubenswrapper[4959]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56fh88h689h75h5f9h595h5c6h5fbhdh54bh5f7h94h89hd9hd7h65bh655h5bdh649hcch594h696h685hf9h54dh5dch57fh556h5ddh57dh556h665q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z572g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec
:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-65d5b6b857-sjcx7_openstack(bca29064-c4e9-4bc7-80cb-d91ec2419edd): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/bca29064-c4e9-4bc7-80cb-d91ec2419edd/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 13:17:37 crc kubenswrapper[4959]: > logger="UnhandledError" Oct 07 13:17:37 crc kubenswrapper[4959]: E1007 13:17:37.449490 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/bca29064-c4e9-4bc7-80cb-d91ec2419edd/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" podUID="bca29064-c4e9-4bc7-80cb-d91ec2419edd" Oct 07 13:17:37 crc 
kubenswrapper[4959]: I1007 13:17:37.477515 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-combined-ca-bundle\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477620 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477660 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-ovndb-tls-certs\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477695 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-dns-svc\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477734 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznvf\" (UniqueName: \"kubernetes.io/projected/b291ed18-9a20-42a8-9cf7-64a9ef0251db-kube-api-access-gznvf\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477778 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2xg\" (UniqueName: \"kubernetes.io/projected/e4e49e10-6ce8-4bde-b15c-141fe2479574-kube-api-access-9h2xg\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477814 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-config\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.477971 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.478003 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-httpd-config\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.478030 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-config\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.479458 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-config\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.479516 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-dns-svc\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.479463 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.481127 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.484572 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-ovndb-tls-certs\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.486476 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-config\") pod \"neutron-94d9575bd-dbgt6\" (UID: 
\"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.495136 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-combined-ca-bundle\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.500008 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-httpd-config\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.501365 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2xg\" (UniqueName: \"kubernetes.io/projected/e4e49e10-6ce8-4bde-b15c-141fe2479574-kube-api-access-9h2xg\") pod \"neutron-94d9575bd-dbgt6\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.507863 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznvf\" (UniqueName: \"kubernetes.io/projected/b291ed18-9a20-42a8-9cf7-64a9ef0251db-kube-api-access-gznvf\") pod \"dnsmasq-dns-6dc8d75dbf-hpb8g\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") " pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.534952 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.606944 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:37 crc kubenswrapper[4959]: I1007 13:17:37.991204 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"] Oct 07 13:17:37 crc kubenswrapper[4959]: W1007 13:17:37.995726 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb291ed18_9a20_42a8_9cf7_64a9ef0251db.slice/crio-1f6c3c47fa4ad666c4d9e80d4066293f1a9b49ed7c428e15cde17ab768407997 WatchSource:0}: Error finding container 1f6c3c47fa4ad666c4d9e80d4066293f1a9b49ed7c428e15cde17ab768407997: Status 404 returned error can't find the container with id 1f6c3c47fa4ad666c4d9e80d4066293f1a9b49ed7c428e15cde17ab768407997 Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.019117 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" event={"ID":"b291ed18-9a20-42a8-9cf7-64a9ef0251db","Type":"ContainerStarted","Data":"1f6c3c47fa4ad666c4d9e80d4066293f1a9b49ed7c428e15cde17ab768407997"} Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.255418 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.397316 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z572g\" (UniqueName: \"kubernetes.io/projected/bca29064-c4e9-4bc7-80cb-d91ec2419edd-kube-api-access-z572g\") pod \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.397473 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-config\") pod \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.397716 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-sb\") pod \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.397781 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-nb\") pod \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.397829 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-dns-svc\") pod \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\" (UID: \"bca29064-c4e9-4bc7-80cb-d91ec2419edd\") " Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.402197 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bca29064-c4e9-4bc7-80cb-d91ec2419edd-kube-api-access-z572g" (OuterVolumeSpecName: "kube-api-access-z572g") pod "bca29064-c4e9-4bc7-80cb-d91ec2419edd" (UID: "bca29064-c4e9-4bc7-80cb-d91ec2419edd"). InnerVolumeSpecName "kube-api-access-z572g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.450814 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bca29064-c4e9-4bc7-80cb-d91ec2419edd" (UID: "bca29064-c4e9-4bc7-80cb-d91ec2419edd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.452689 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-config" (OuterVolumeSpecName: "config") pod "bca29064-c4e9-4bc7-80cb-d91ec2419edd" (UID: "bca29064-c4e9-4bc7-80cb-d91ec2419edd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.454341 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bca29064-c4e9-4bc7-80cb-d91ec2419edd" (UID: "bca29064-c4e9-4bc7-80cb-d91ec2419edd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.465084 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bca29064-c4e9-4bc7-80cb-d91ec2419edd" (UID: "bca29064-c4e9-4bc7-80cb-d91ec2419edd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.501949 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z572g\" (UniqueName: \"kubernetes.io/projected/bca29064-c4e9-4bc7-80cb-d91ec2419edd-kube-api-access-z572g\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.501986 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.501996 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.502005 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:38 crc kubenswrapper[4959]: I1007 13:17:38.502013 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca29064-c4e9-4bc7-80cb-d91ec2419edd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.033160 4959 generic.go:334] "Generic (PLEG): container finished" podID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerID="dab9b30874dfb18d1fb10a6a3f95f50922608f45ccc911b646b7f9f25a256a97" exitCode=137 Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.033468 4959 generic.go:334] "Generic (PLEG): container finished" podID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerID="bf00157c74bc07a894f3a25bad225aa723be81199730cb3659d8c902e6a5c287" exitCode=137 Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.033376 4959 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-6d9d976f-mvq6h" event={"ID":"1cbb164d-72db-4115-a2b4-e4f2beef4afd","Type":"ContainerDied","Data":"dab9b30874dfb18d1fb10a6a3f95f50922608f45ccc911b646b7f9f25a256a97"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.033537 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d9d976f-mvq6h" event={"ID":"1cbb164d-72db-4115-a2b4-e4f2beef4afd","Type":"ContainerDied","Data":"bf00157c74bc07a894f3a25bad225aa723be81199730cb3659d8c902e6a5c287"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.035852 4959 generic.go:334] "Generic (PLEG): container finished" podID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerID="0387d868afea277af48228e13a937a66781088cc91bbbc889aa8139071adb0dc" exitCode=137
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.035894 4959 generic.go:334] "Generic (PLEG): container finished" podID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerID="27a487aeb712f4d6a99f0594e41aa03d699e1f30a738134fea1c143b94f60480" exitCode=137
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.035989 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f68678c59-7z9rj" event={"ID":"17e0751a-3aa2-40ec-87c6-d61c6205ff61","Type":"ContainerDied","Data":"0387d868afea277af48228e13a937a66781088cc91bbbc889aa8139071adb0dc"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.036025 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f68678c59-7z9rj" event={"ID":"17e0751a-3aa2-40ec-87c6-d61c6205ff61","Type":"ContainerDied","Data":"27a487aeb712f4d6a99f0594e41aa03d699e1f30a738134fea1c143b94f60480"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.040050 4959 generic.go:334] "Generic (PLEG): container finished" podID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerID="0206f4ab7265a3912f96efab6a3043e939436e74954f04ba99ff1edf6d217dcd" exitCode=137
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.040088 4959 generic.go:334] "Generic (PLEG): container finished" podID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerID="cf4d8edf09aa8c1718091f0156df1e4a6d3ebd531821fdc63fa104e6de5e15b2" exitCode=137
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.040130 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55884ff9b9-x27dr" event={"ID":"7c7f07f0-81b0-4f10-9327-2eb3e433ee40","Type":"ContainerDied","Data":"0206f4ab7265a3912f96efab6a3043e939436e74954f04ba99ff1edf6d217dcd"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.040182 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55884ff9b9-x27dr" event={"ID":"7c7f07f0-81b0-4f10-9327-2eb3e433ee40","Type":"ContainerDied","Data":"cf4d8edf09aa8c1718091f0156df1e4a6d3ebd531821fdc63fa104e6de5e15b2"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.042010 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" event={"ID":"b291ed18-9a20-42a8-9cf7-64a9ef0251db","Type":"ContainerStarted","Data":"e52db1c46fa22ff6d19858d91ef640ccba2e638180f1a0ed16359462adf62033"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.044366 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7" event={"ID":"bca29064-c4e9-4bc7-80cb-d91ec2419edd","Type":"ContainerDied","Data":"2f7f1884e8eaa357de1baa77aa70d80d1e23805326481ecd0e18dff34806f15a"}
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.044417 4959 scope.go:117] "RemoveContainer" containerID="5bb503929b35b7a6af8b786ab06b9998d93a23df085acf2d936dfd724e0b7648"
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.044431 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65d5b6b857-sjcx7"
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.226443 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d5b6b857-sjcx7"]
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.240168 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65d5b6b857-sjcx7"]
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.706201 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f68678c59-7z9rj"
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.834902 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-config-data\") pod \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") "
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.835049 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0751a-3aa2-40ec-87c6-d61c6205ff61-logs\") pod \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") "
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.835120 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdgv7\" (UniqueName: \"kubernetes.io/projected/17e0751a-3aa2-40ec-87c6-d61c6205ff61-kube-api-access-wdgv7\") pod \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") "
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.835157 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-scripts\") pod \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") "
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.835209 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17e0751a-3aa2-40ec-87c6-d61c6205ff61-horizon-secret-key\") pod \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\" (UID: \"17e0751a-3aa2-40ec-87c6-d61c6205ff61\") "
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.836005 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e0751a-3aa2-40ec-87c6-d61c6205ff61-logs" (OuterVolumeSpecName: "logs") pod "17e0751a-3aa2-40ec-87c6-d61c6205ff61" (UID: "17e0751a-3aa2-40ec-87c6-d61c6205ff61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.842219 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e0751a-3aa2-40ec-87c6-d61c6205ff61-kube-api-access-wdgv7" (OuterVolumeSpecName: "kube-api-access-wdgv7") pod "17e0751a-3aa2-40ec-87c6-d61c6205ff61" (UID: "17e0751a-3aa2-40ec-87c6-d61c6205ff61"). InnerVolumeSpecName "kube-api-access-wdgv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.843826 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e0751a-3aa2-40ec-87c6-d61c6205ff61-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "17e0751a-3aa2-40ec-87c6-d61c6205ff61" (UID: "17e0751a-3aa2-40ec-87c6-d61c6205ff61"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.888033 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-config-data" (OuterVolumeSpecName: "config-data") pod "17e0751a-3aa2-40ec-87c6-d61c6205ff61" (UID: "17e0751a-3aa2-40ec-87c6-d61c6205ff61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.888731 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-scripts" (OuterVolumeSpecName: "scripts") pod "17e0751a-3aa2-40ec-87c6-d61c6205ff61" (UID: "17e0751a-3aa2-40ec-87c6-d61c6205ff61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.937497 4959 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17e0751a-3aa2-40ec-87c6-d61c6205ff61-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.937531 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.937545 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0751a-3aa2-40ec-87c6-d61c6205ff61-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.937556 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdgv7\" (UniqueName: \"kubernetes.io/projected/17e0751a-3aa2-40ec-87c6-d61c6205ff61-kube-api-access-wdgv7\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:39 crc kubenswrapper[4959]: I1007 13:17:39.937567 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17e0751a-3aa2-40ec-87c6-d61c6205ff61-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.025545 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-94d9575bd-dbgt6"]
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.055659 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8465f58844-ng7np" event={"ID":"4e95dc55-c289-4042-80c7-bc5253f80e0f","Type":"ContainerStarted","Data":"c3ae4d1755cf6b038d85ba11679848e44bee70ca4de9e7de82318ac06a200b2a"}
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.056099 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8465f58844-ng7np"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.056171 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8465f58844-ng7np"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.057949 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f68678c59-7z9rj" event={"ID":"17e0751a-3aa2-40ec-87c6-d61c6205ff61","Type":"ContainerDied","Data":"30ef9eccd85a6a30c811ff70ec8b373db7ede70bb6052b7f13ab76a5adf09db4"}
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.057996 4959 scope.go:117] "RemoveContainer" containerID="0387d868afea277af48228e13a937a66781088cc91bbbc889aa8139071adb0dc"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.058007 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f68678c59-7z9rj"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.059071 4959 generic.go:334] "Generic (PLEG): container finished" podID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerID="e52db1c46fa22ff6d19858d91ef640ccba2e638180f1a0ed16359462adf62033" exitCode=0
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.059098 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" event={"ID":"b291ed18-9a20-42a8-9cf7-64a9ef0251db","Type":"ContainerDied","Data":"e52db1c46fa22ff6d19858d91ef640ccba2e638180f1a0ed16359462adf62033"}
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.075026 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8465f58844-ng7np" podStartSLOduration=5.075004823 podStartE2EDuration="5.075004823s" podCreationTimestamp="2025-10-07 13:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:40.07421092 +0000 UTC m=+1012.234933597" watchObservedRunningTime="2025-10-07 13:17:40.075004823 +0000 UTC m=+1012.235727500"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.120576 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f68678c59-7z9rj"]
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.126533 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f68678c59-7z9rj"]
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.561126 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.675928 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-dc68dfcf6-xkrw7" podUID="41b4db91-ead3-4028-b30c-e3e726ae6f1e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.768651 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86cfbf9b4f-pxglw"]
Oct 07 13:17:40 crc kubenswrapper[4959]: E1007 13:17:40.769079 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon-log"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.769096 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon-log"
Oct 07 13:17:40 crc kubenswrapper[4959]: E1007 13:17:40.769129 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.769137 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon"
Oct 07 13:17:40 crc kubenswrapper[4959]: E1007 13:17:40.769155 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca29064-c4e9-4bc7-80cb-d91ec2419edd" containerName="init"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.769162 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca29064-c4e9-4bc7-80cb-d91ec2419edd" containerName="init"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.769369 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca29064-c4e9-4bc7-80cb-d91ec2419edd" containerName="init"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.769381 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.769398 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" containerName="horizon-log"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.770690 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.783207 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.783498 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.787870 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86cfbf9b4f-pxglw"]
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.823988 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e0751a-3aa2-40ec-87c6-d61c6205ff61" path="/var/lib/kubelet/pods/17e0751a-3aa2-40ec-87c6-d61c6205ff61/volumes"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.829947 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca29064-c4e9-4bc7-80cb-d91ec2419edd" path="/var/lib/kubelet/pods/bca29064-c4e9-4bc7-80cb-d91ec2419edd/volumes"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.953697 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-public-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.954283 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-internal-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.954476 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-config\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.954595 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-ovndb-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.954709 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zc45\" (UniqueName: \"kubernetes.io/projected/46472ab2-866f-4b3c-b030-7b05d02f9176-kube-api-access-6zc45\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.954852 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-combined-ca-bundle\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:40 crc kubenswrapper[4959]: I1007 13:17:40.954979 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-httpd-config\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057430 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-public-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057484 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-internal-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057526 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-config\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057557 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-ovndb-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057576 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zc45\" (UniqueName: \"kubernetes.io/projected/46472ab2-866f-4b3c-b030-7b05d02f9176-kube-api-access-6zc45\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057611 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-combined-ca-bundle\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.057650 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-httpd-config\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.063823 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-config\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.064827 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-public-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.064914 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-httpd-config\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.065357 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-internal-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.065934 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-ovndb-tls-certs\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.069538 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46472ab2-866f-4b3c-b030-7b05d02f9176-combined-ca-bundle\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.072168 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zc45\" (UniqueName: \"kubernetes.io/projected/46472ab2-866f-4b3c-b030-7b05d02f9176-kube-api-access-6zc45\") pod \"neutron-86cfbf9b4f-pxglw\" (UID: \"46472ab2-866f-4b3c-b030-7b05d02f9176\") " pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:41 crc kubenswrapper[4959]: I1007 13:17:41.091910 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86cfbf9b4f-pxglw"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.125573 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d8db6f568-8zwbx"]
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.128457 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.134800 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.135518 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.172676 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8db6f568-8zwbx"]
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.287785 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-config-data\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.287918 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-logs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.287954 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-public-tls-certs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.287990 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t28q\" (UniqueName: \"kubernetes.io/projected/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-kube-api-access-7t28q\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.288043 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-combined-ca-bundle\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.288083 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-internal-tls-certs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.288117 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-config-data-custom\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388509 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-public-tls-certs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388549 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t28q\" (UniqueName: \"kubernetes.io/projected/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-kube-api-access-7t28q\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388589 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-combined-ca-bundle\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388617 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-internal-tls-certs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388657 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-config-data-custom\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388683 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-config-data\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.388750 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-logs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.389155 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-logs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.393865 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-internal-tls-certs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.394021 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-public-tls-certs\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.395369 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-config-data\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.401544 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-combined-ca-bundle\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.406289 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-config-data-custom\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.406661 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t28q\" (UniqueName: \"kubernetes.io/projected/80c6297a-2d51-4a7b-9da0-761f69d6f3b7-kube-api-access-7t28q\") pod \"barbican-api-5d8db6f568-8zwbx\" (UID: \"80c6297a-2d51-4a7b-9da0-761f69d6f3b7\") " pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.467055 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8db6f568-8zwbx"
Oct 07 13:17:42 crc kubenswrapper[4959]: I1007 13:17:42.653081 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8465f58844-ng7np"
Oct 07 13:17:43 crc kubenswrapper[4959]: W1007 13:17:43.670548 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e49e10_6ce8_4bde_b15c_141fe2479574.slice/crio-94a32e3fc7ddebda4df383dbb4c8cb3984d8ef994f29c660e20cd45ce17c6058 WatchSource:0}: Error finding container 94a32e3fc7ddebda4df383dbb4c8cb3984d8ef994f29c660e20cd45ce17c6058: Status 404 returned error can't find the container with id 94a32e3fc7ddebda4df383dbb4c8cb3984d8ef994f29c660e20cd45ce17c6058
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.759330 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55884ff9b9-x27dr"
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.766093 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d9d976f-mvq6h"
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.831222 4959 scope.go:117] "RemoveContainer" containerID="27a487aeb712f4d6a99f0594e41aa03d699e1f30a738134fea1c143b94f60480"
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.924832 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb6sn\" (UniqueName: \"kubernetes.io/projected/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-kube-api-access-tb6sn\") pod \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.924900 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cbb164d-72db-4115-a2b4-e4f2beef4afd-horizon-secret-key\") pod \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.924963 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-logs\") pod \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.924983 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czz6q\" (UniqueName: \"kubernetes.io/projected/1cbb164d-72db-4115-a2b4-e4f2beef4afd-kube-api-access-czz6q\") pod \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.925080 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-scripts\") pod \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.925138 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-config-data\") pod \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.925160 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-scripts\") pod \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.925180 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-config-data\") pod \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.925238 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-horizon-secret-key\") pod \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\" (UID: \"7c7f07f0-81b0-4f10-9327-2eb3e433ee40\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.925255 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbb164d-72db-4115-a2b4-e4f2beef4afd-logs\") pod \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\" (UID: \"1cbb164d-72db-4115-a2b4-e4f2beef4afd\") "
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.926413 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cbb164d-72db-4115-a2b4-e4f2beef4afd-logs" (OuterVolumeSpecName: "logs") pod "1cbb164d-72db-4115-a2b4-e4f2beef4afd" (UID: "1cbb164d-72db-4115-a2b4-e4f2beef4afd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.927244 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-logs" (OuterVolumeSpecName: "logs") pod "7c7f07f0-81b0-4f10-9327-2eb3e433ee40" (UID: "7c7f07f0-81b0-4f10-9327-2eb3e433ee40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.928847 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-kube-api-access-tb6sn" (OuterVolumeSpecName: "kube-api-access-tb6sn") pod "7c7f07f0-81b0-4f10-9327-2eb3e433ee40" (UID: "7c7f07f0-81b0-4f10-9327-2eb3e433ee40"). InnerVolumeSpecName "kube-api-access-tb6sn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.930019 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbb164d-72db-4115-a2b4-e4f2beef4afd-kube-api-access-czz6q" (OuterVolumeSpecName: "kube-api-access-czz6q") pod "1cbb164d-72db-4115-a2b4-e4f2beef4afd" (UID: "1cbb164d-72db-4115-a2b4-e4f2beef4afd"). InnerVolumeSpecName "kube-api-access-czz6q".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.930756 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cbb164d-72db-4115-a2b4-e4f2beef4afd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1cbb164d-72db-4115-a2b4-e4f2beef4afd" (UID: "1cbb164d-72db-4115-a2b4-e4f2beef4afd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.930814 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7c7f07f0-81b0-4f10-9327-2eb3e433ee40" (UID: "7c7f07f0-81b0-4f10-9327-2eb3e433ee40"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.948047 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-scripts" (OuterVolumeSpecName: "scripts") pod "1cbb164d-72db-4115-a2b4-e4f2beef4afd" (UID: "1cbb164d-72db-4115-a2b4-e4f2beef4afd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.949238 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-scripts" (OuterVolumeSpecName: "scripts") pod "7c7f07f0-81b0-4f10-9327-2eb3e433ee40" (UID: "7c7f07f0-81b0-4f10-9327-2eb3e433ee40"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.949326 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-config-data" (OuterVolumeSpecName: "config-data") pod "1cbb164d-72db-4115-a2b4-e4f2beef4afd" (UID: "1cbb164d-72db-4115-a2b4-e4f2beef4afd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:43 crc kubenswrapper[4959]: I1007 13:17:43.949336 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-config-data" (OuterVolumeSpecName: "config-data") pod "7c7f07f0-81b0-4f10-9327-2eb3e433ee40" (UID: "7c7f07f0-81b0-4f10-9327-2eb3e433ee40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027377 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czz6q\" (UniqueName: \"kubernetes.io/projected/1cbb164d-72db-4115-a2b4-e4f2beef4afd-kube-api-access-czz6q\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027416 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027428 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027439 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cbb164d-72db-4115-a2b4-e4f2beef4afd-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc 
kubenswrapper[4959]: I1007 13:17:44.027463 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027473 4959 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027481 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbb164d-72db-4115-a2b4-e4f2beef4afd-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027490 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb6sn\" (UniqueName: \"kubernetes.io/projected/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-kube-api-access-tb6sn\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027498 4959 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cbb164d-72db-4115-a2b4-e4f2beef4afd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.027506 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7f07f0-81b0-4f10-9327-2eb3e433ee40-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.111718 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55884ff9b9-x27dr" event={"ID":"7c7f07f0-81b0-4f10-9327-2eb3e433ee40","Type":"ContainerDied","Data":"c55b3dc9c2061fd0a597b93db7104e96e9a65d077cc845b936db2db08d39b14f"} Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.111808 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55884ff9b9-x27dr" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.120497 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d9d976f-mvq6h" event={"ID":"1cbb164d-72db-4115-a2b4-e4f2beef4afd","Type":"ContainerDied","Data":"6e0f3742625c68f760a9726285f91811b4982b1399f03830c29e9ea4fa8bcbb4"} Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.120544 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d9d976f-mvq6h" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.124102 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94d9575bd-dbgt6" event={"ID":"e4e49e10-6ce8-4bde-b15c-141fe2479574","Type":"ContainerStarted","Data":"94a32e3fc7ddebda4df383dbb4c8cb3984d8ef994f29c660e20cd45ce17c6058"} Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.148161 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55884ff9b9-x27dr"] Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.161482 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55884ff9b9-x27dr"] Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.182263 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d9d976f-mvq6h"] Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.187615 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d9d976f-mvq6h"] Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.797971 4959 scope.go:117] "RemoveContainer" containerID="0206f4ab7265a3912f96efab6a3043e939436e74954f04ba99ff1edf6d217dcd" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.821709 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" path="/var/lib/kubelet/pods/1cbb164d-72db-4115-a2b4-e4f2beef4afd/volumes" Oct 07 13:17:44 crc kubenswrapper[4959]: I1007 13:17:44.822645 4959 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" path="/var/lib/kubelet/pods/7c7f07f0-81b0-4f10-9327-2eb3e433ee40/volumes" Oct 07 13:17:45 crc kubenswrapper[4959]: I1007 13:17:45.431392 4959 scope.go:117] "RemoveContainer" containerID="cf4d8edf09aa8c1718091f0156df1e4a6d3ebd531821fdc63fa104e6de5e15b2" Oct 07 13:17:45 crc kubenswrapper[4959]: I1007 13:17:45.572395 4959 scope.go:117] "RemoveContainer" containerID="dab9b30874dfb18d1fb10a6a3f95f50922608f45ccc911b646b7f9f25a256a97" Oct 07 13:17:45 crc kubenswrapper[4959]: E1007 13:17:45.988216 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" Oct 07 13:17:45 crc kubenswrapper[4959]: I1007 13:17:45.993938 4959 scope.go:117] "RemoveContainer" containerID="bf00157c74bc07a894f3a25bad225aa723be81199730cb3659d8c902e6a5c287" Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.045317 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86cfbf9b4f-pxglw"] Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.063339 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8db6f568-8zwbx"] Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.156051 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" event={"ID":"b291ed18-9a20-42a8-9cf7-64a9ef0251db","Type":"ContainerStarted","Data":"33b11c80ee43caa19ebb078fa9d5df7b1c49895db7d508051650adc20a1e0692"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.157000 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.160345 4959 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" event={"ID":"97567312-2948-4f23-a1e5-da00d2689376","Type":"ContainerStarted","Data":"ea00930057bb7f5f4b9ee889798ba7f57135df962e4d218c35ea83f2d3671d18"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.164889 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8db6f568-8zwbx" event={"ID":"80c6297a-2d51-4a7b-9da0-761f69d6f3b7","Type":"ContainerStarted","Data":"cd1cdba681c5d30778d8e5baabfc5864597426739a0a3280493ccc053f9f0cf3"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.170186 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-878d55485-gnqkk" event={"ID":"68235903-6ab3-44c7-90a1-c49f473e4568","Type":"ContainerStarted","Data":"b8c3feb897a29233fa9feb1bbe6a3a5acfa25c7f53bd88d6b325e6bf4682372c"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.171528 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86cfbf9b4f-pxglw" event={"ID":"46472ab2-866f-4b3c-b030-7b05d02f9176","Type":"ContainerStarted","Data":"212321baaec421675d39fcb88bc9e8117c9d701ce49b400bcbbb127d42b09594"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.178469 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" podStartSLOduration=9.178453159 podStartE2EDuration="9.178453159s" podCreationTimestamp="2025-10-07 13:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:46.177957035 +0000 UTC m=+1018.338679732" watchObservedRunningTime="2025-10-07 13:17:46.178453159 +0000 UTC m=+1018.339175836" Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.188471 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94d9575bd-dbgt6" 
event={"ID":"e4e49e10-6ce8-4bde-b15c-141fe2479574","Type":"ContainerStarted","Data":"c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.192760 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerStarted","Data":"1da16e672e5af8fa1dde549408c2929272536fb777c707ee54df1e47344661b3"} Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.192887 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="ceilometer-notification-agent" containerID="cri-o://f9b18d1cfd14c4c147769634a5e60537e955aa28e78fb029751696cbae57a53d" gracePeriod=30 Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.192928 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.192989 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="proxy-httpd" containerID="cri-o://1da16e672e5af8fa1dde549408c2929272536fb777c707ee54df1e47344661b3" gracePeriod=30 Oct 07 13:17:46 crc kubenswrapper[4959]: I1007 13:17:46.193027 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="sg-core" containerID="cri-o://fd730955a7c41260cd0b6037668b15ee3386a0b448a83cd1ed585ec0b0724e99" gracePeriod=30 Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.203994 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94d9575bd-dbgt6" event={"ID":"e4e49e10-6ce8-4bde-b15c-141fe2479574","Type":"ContainerStarted","Data":"e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018"} Oct 07 13:17:47 crc 
kubenswrapper[4959]: I1007 13:17:47.205048 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.209758 4959 generic.go:334] "Generic (PLEG): container finished" podID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerID="1da16e672e5af8fa1dde549408c2929272536fb777c707ee54df1e47344661b3" exitCode=0 Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.209783 4959 generic.go:334] "Generic (PLEG): container finished" podID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerID="fd730955a7c41260cd0b6037668b15ee3386a0b448a83cd1ed585ec0b0724e99" exitCode=2 Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.209788 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerDied","Data":"1da16e672e5af8fa1dde549408c2929272536fb777c707ee54df1e47344661b3"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.209820 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerDied","Data":"fd730955a7c41260cd0b6037668b15ee3386a0b448a83cd1ed585ec0b0724e99"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.217590 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8db6f568-8zwbx" event={"ID":"80c6297a-2d51-4a7b-9da0-761f69d6f3b7","Type":"ContainerStarted","Data":"fcec517512a3c0f047826f52e783baf5bf64462f21f453295c6514a2c114e776"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.217781 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8db6f568-8zwbx" event={"ID":"80c6297a-2d51-4a7b-9da0-761f69d6f3b7","Type":"ContainerStarted","Data":"6ac3fc8350412e216db8ec7f8d4dbec3b5b9d7186fea9a1bbc926a515ce123be"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.217857 4959 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-5d8db6f568-8zwbx" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.217899 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8db6f568-8zwbx" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.224426 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-94d9575bd-dbgt6" podStartSLOduration=10.224412133 podStartE2EDuration="10.224412133s" podCreationTimestamp="2025-10-07 13:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:47.222162157 +0000 UTC m=+1019.382884834" watchObservedRunningTime="2025-10-07 13:17:47.224412133 +0000 UTC m=+1019.385134810" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.243321 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkdbw" event={"ID":"bd267601-4074-4cfb-8b40-8cd5fa12917c","Type":"ContainerStarted","Data":"8869094badb76c84646e0c5bfa5b6c1571df98ca907bfc24247c204c12a28896"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.275035 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d8db6f568-8zwbx" podStartSLOduration=5.275010644 podStartE2EDuration="5.275010644s" podCreationTimestamp="2025-10-07 13:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:47.241313007 +0000 UTC m=+1019.402035694" watchObservedRunningTime="2025-10-07 13:17:47.275010644 +0000 UTC m=+1019.435733321" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.330164 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-878d55485-gnqkk" 
event={"ID":"68235903-6ab3-44c7-90a1-c49f473e4568","Type":"ContainerStarted","Data":"031af8777dbbaaface04b90dafa94742ed5ff096517149fbbe04a4f4e7f2774a"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.332383 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qkdbw" podStartSLOduration=6.6236984759999995 podStartE2EDuration="52.332368793s" podCreationTimestamp="2025-10-07 13:16:55 +0000 UTC" firstStartedPulling="2025-10-07 13:16:59.847708445 +0000 UTC m=+972.008431122" lastFinishedPulling="2025-10-07 13:17:45.556378762 +0000 UTC m=+1017.717101439" observedRunningTime="2025-10-07 13:17:47.305175437 +0000 UTC m=+1019.465898124" watchObservedRunningTime="2025-10-07 13:17:47.332368793 +0000 UTC m=+1019.493091470" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.360107 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-878d55485-gnqkk" podStartSLOduration=3.153141794 podStartE2EDuration="12.360086594s" podCreationTimestamp="2025-10-07 13:17:35 +0000 UTC" firstStartedPulling="2025-10-07 13:17:36.060927278 +0000 UTC m=+1008.221649955" lastFinishedPulling="2025-10-07 13:17:45.267872088 +0000 UTC m=+1017.428594755" observedRunningTime="2025-10-07 13:17:47.351048259 +0000 UTC m=+1019.511770936" watchObservedRunningTime="2025-10-07 13:17:47.360086594 +0000 UTC m=+1019.520809271" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.364102 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86cfbf9b4f-pxglw" event={"ID":"46472ab2-866f-4b3c-b030-7b05d02f9176","Type":"ContainerStarted","Data":"945dcfcc8fe4d55a17c8dcc6f0c502d332292d9be0d015a01847144ba13efc94"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.364141 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86cfbf9b4f-pxglw" 
event={"ID":"46472ab2-866f-4b3c-b030-7b05d02f9176","Type":"ContainerStarted","Data":"9ecec6d5ef430bfca7c5727bf0c1695aebf51e824d87482499e1a05f8257be3d"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.364154 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86cfbf9b4f-pxglw" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.386345 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86cfbf9b4f-pxglw" podStartSLOduration=7.386324522 podStartE2EDuration="7.386324522s" podCreationTimestamp="2025-10-07 13:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:47.381612634 +0000 UTC m=+1019.542335321" watchObservedRunningTime="2025-10-07 13:17:47.386324522 +0000 UTC m=+1019.547047199" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.400165 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" event={"ID":"97567312-2948-4f23-a1e5-da00d2689376","Type":"ContainerStarted","Data":"9cade37a89a464f2be90230588f36f5fbcd937e2380c5aa9232ac5b29c1c4f5e"} Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.428278 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-57fd9f6674-4cfc2" podStartSLOduration=3.35801419 podStartE2EDuration="12.428263329s" podCreationTimestamp="2025-10-07 13:17:35 +0000 UTC" firstStartedPulling="2025-10-07 13:17:36.174799031 +0000 UTC m=+1008.335521708" lastFinishedPulling="2025-10-07 13:17:45.24504817 +0000 UTC m=+1017.405770847" observedRunningTime="2025-10-07 13:17:47.427052014 +0000 UTC m=+1019.587774701" watchObservedRunningTime="2025-10-07 13:17:47.428263329 +0000 UTC m=+1019.588986006" Oct 07 13:17:47 crc kubenswrapper[4959]: I1007 13:17:47.604102 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:51 crc kubenswrapper[4959]: I1007 13:17:51.437472 4959 generic.go:334] "Generic (PLEG): container finished" podID="bd267601-4074-4cfb-8b40-8cd5fa12917c" containerID="8869094badb76c84646e0c5bfa5b6c1571df98ca907bfc24247c204c12a28896" exitCode=0 Oct 07 13:17:51 crc kubenswrapper[4959]: I1007 13:17:51.437678 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkdbw" event={"ID":"bd267601-4074-4cfb-8b40-8cd5fa12917c","Type":"ContainerDied","Data":"8869094badb76c84646e0c5bfa5b6c1571df98ca907bfc24247c204c12a28896"} Oct 07 13:17:51 crc kubenswrapper[4959]: I1007 13:17:51.854179 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:51 crc kubenswrapper[4959]: I1007 13:17:51.985254 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58674f758b-wncml" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.451740 4959 generic.go:334] "Generic (PLEG): container finished" podID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerID="f9b18d1cfd14c4c147769634a5e60537e955aa28e78fb029751696cbae57a53d" exitCode=0 Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.452019 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerDied","Data":"f9b18d1cfd14c4c147769634a5e60537e955aa28e78fb029751696cbae57a53d"} Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.536815 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.596467 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b99bccc6c-wnttr"] Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.596714 4959 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerName="dnsmasq-dns" containerID="cri-o://b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc" gracePeriod=10 Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.719201 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820781 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-scripts\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820816 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-run-httpd\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820860 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgmjs\" (UniqueName: \"kubernetes.io/projected/c6955960-fcc4-4d43-9774-fafd72ee3569-kube-api-access-wgmjs\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820878 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-log-httpd\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820940 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-combined-ca-bundle\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820972 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-config-data\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.820992 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-sg-core-conf-yaml\") pod \"c6955960-fcc4-4d43-9774-fafd72ee3569\" (UID: \"c6955960-fcc4-4d43-9774-fafd72ee3569\") " Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.822420 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.827761 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: E1007 13:17:52.834420 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a2196b_b17f_49ae_a7f2_3ad72d0ff043.slice/crio-conmon-b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a2196b_b17f_49ae_a7f2_3ad72d0ff043.slice/crio-b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.839001 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6955960-fcc4-4d43-9774-fafd72ee3569-kube-api-access-wgmjs" (OuterVolumeSpecName: "kube-api-access-wgmjs") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "kube-api-access-wgmjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.852656 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-scripts" (OuterVolumeSpecName: "scripts") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.908804 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.924748 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.924778 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.924789 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.924800 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmjs\" (UniqueName: \"kubernetes.io/projected/c6955960-fcc4-4d43-9774-fafd72ee3569-kube-api-access-wgmjs\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.924813 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6955960-fcc4-4d43-9774-fafd72ee3569-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.946762 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.994324 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-config-data" (OuterVolumeSpecName: "config-data") pod "c6955960-fcc4-4d43-9774-fafd72ee3569" (UID: "c6955960-fcc4-4d43-9774-fafd72ee3569"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:52 crc kubenswrapper[4959]: I1007 13:17:52.997608 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.032702 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.032747 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6955960-fcc4-4d43-9774-fafd72ee3569-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.079439 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.134119 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd267601-4074-4cfb-8b40-8cd5fa12917c-etc-machine-id\") pod \"bd267601-4074-4cfb-8b40-8cd5fa12917c\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.134311 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-db-sync-config-data\") pod \"bd267601-4074-4cfb-8b40-8cd5fa12917c\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.134337 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-combined-ca-bundle\") pod \"bd267601-4074-4cfb-8b40-8cd5fa12917c\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.134374 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-scripts\") pod \"bd267601-4074-4cfb-8b40-8cd5fa12917c\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.134390 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-config-data\") pod \"bd267601-4074-4cfb-8b40-8cd5fa12917c\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.134414 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzw42\" 
(UniqueName: \"kubernetes.io/projected/bd267601-4074-4cfb-8b40-8cd5fa12917c-kube-api-access-hzw42\") pod \"bd267601-4074-4cfb-8b40-8cd5fa12917c\" (UID: \"bd267601-4074-4cfb-8b40-8cd5fa12917c\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.135408 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd267601-4074-4cfb-8b40-8cd5fa12917c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bd267601-4074-4cfb-8b40-8cd5fa12917c" (UID: "bd267601-4074-4cfb-8b40-8cd5fa12917c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.138558 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-scripts" (OuterVolumeSpecName: "scripts") pod "bd267601-4074-4cfb-8b40-8cd5fa12917c" (UID: "bd267601-4074-4cfb-8b40-8cd5fa12917c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.140529 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd267601-4074-4cfb-8b40-8cd5fa12917c-kube-api-access-hzw42" (OuterVolumeSpecName: "kube-api-access-hzw42") pod "bd267601-4074-4cfb-8b40-8cd5fa12917c" (UID: "bd267601-4074-4cfb-8b40-8cd5fa12917c"). InnerVolumeSpecName "kube-api-access-hzw42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.141797 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bd267601-4074-4cfb-8b40-8cd5fa12917c" (UID: "bd267601-4074-4cfb-8b40-8cd5fa12917c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.161788 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd267601-4074-4cfb-8b40-8cd5fa12917c" (UID: "bd267601-4074-4cfb-8b40-8cd5fa12917c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.204560 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-config-data" (OuterVolumeSpecName: "config-data") pod "bd267601-4074-4cfb-8b40-8cd5fa12917c" (UID: "bd267601-4074-4cfb-8b40-8cd5fa12917c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.235569 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-dns-svc\") pod \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.235645 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-nb\") pod \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.235674 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-sb\") pod \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " Oct 07 13:17:53 crc 
kubenswrapper[4959]: I1007 13:17:53.235742 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csvj2\" (UniqueName: \"kubernetes.io/projected/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-kube-api-access-csvj2\") pod \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.235762 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-config\") pod \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\" (UID: \"05a2196b-b17f-49ae-a7f2-3ad72d0ff043\") " Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.236149 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzw42\" (UniqueName: \"kubernetes.io/projected/bd267601-4074-4cfb-8b40-8cd5fa12917c-kube-api-access-hzw42\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.236165 4959 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd267601-4074-4cfb-8b40-8cd5fa12917c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.236175 4959 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.236187 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.236197 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.236206 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd267601-4074-4cfb-8b40-8cd5fa12917c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.254839 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-kube-api-access-csvj2" (OuterVolumeSpecName: "kube-api-access-csvj2") pod "05a2196b-b17f-49ae-a7f2-3ad72d0ff043" (UID: "05a2196b-b17f-49ae-a7f2-3ad72d0ff043"). InnerVolumeSpecName "kube-api-access-csvj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.272062 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.288019 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05a2196b-b17f-49ae-a7f2-3ad72d0ff043" (UID: "05a2196b-b17f-49ae-a7f2-3ad72d0ff043"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.304819 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c99fbb6b6-2j7rt" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.305530 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "05a2196b-b17f-49ae-a7f2-3ad72d0ff043" (UID: "05a2196b-b17f-49ae-a7f2-3ad72d0ff043"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.311044 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "05a2196b-b17f-49ae-a7f2-3ad72d0ff043" (UID: "05a2196b-b17f-49ae-a7f2-3ad72d0ff043"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.317841 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-config" (OuterVolumeSpecName: "config") pod "05a2196b-b17f-49ae-a7f2-3ad72d0ff043" (UID: "05a2196b-b17f-49ae-a7f2-3ad72d0ff043"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.339759 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.339793 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.339804 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.339814 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csvj2\" (UniqueName: \"kubernetes.io/projected/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-kube-api-access-csvj2\") on node \"crc\" DevicePath \"\"" Oct 07 
13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.339825 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a2196b-b17f-49ae-a7f2-3ad72d0ff043-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.460907 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkdbw" event={"ID":"bd267601-4074-4cfb-8b40-8cd5fa12917c","Type":"ContainerDied","Data":"225cb415ae5ed76787d54d42ae8464a674eb6917c08840f3fe74f9f30a0f564a"} Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.460945 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225cb415ae5ed76787d54d42ae8464a674eb6917c08840f3fe74f9f30a0f564a" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.460948 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkdbw" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.462909 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6955960-fcc4-4d43-9774-fafd72ee3569","Type":"ContainerDied","Data":"3cb2f75c7a7f0adb7f037263ea991b8c053e1bdfa1d031e374efd4b5e046891a"} Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.462952 4959 scope.go:117] "RemoveContainer" containerID="1da16e672e5af8fa1dde549408c2929272536fb777c707ee54df1e47344661b3" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.463112 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.470180 4959 generic.go:334] "Generic (PLEG): container finished" podID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerID="b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc" exitCode=0 Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.470312 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" event={"ID":"05a2196b-b17f-49ae-a7f2-3ad72d0ff043","Type":"ContainerDied","Data":"b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc"} Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.470393 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" event={"ID":"05a2196b-b17f-49ae-a7f2-3ad72d0ff043","Type":"ContainerDied","Data":"f218edea7cc631260dad9745a991b42dbc37b8f4bc8730ff0acbdc4108037586"} Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.470498 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b99bccc6c-wnttr" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.489815 4959 scope.go:117] "RemoveContainer" containerID="fd730955a7c41260cd0b6037668b15ee3386a0b448a83cd1ed585ec0b0724e99" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.516423 4959 scope.go:117] "RemoveContainer" containerID="f9b18d1cfd14c4c147769634a5e60537e955aa28e78fb029751696cbae57a53d" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.520882 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b99bccc6c-wnttr"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.527587 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b99bccc6c-wnttr"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.534399 4959 scope.go:117] "RemoveContainer" containerID="b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.568760 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.581691 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.590800 4959 scope.go:117] "RemoveContainer" containerID="606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.596756 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598217 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598241 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598269 
4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598276 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598302 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="sg-core" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598309 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="sg-core" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598321 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon-log" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598345 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon-log" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598354 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="ceilometer-notification-agent" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598359 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="ceilometer-notification-agent" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598370 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerName="dnsmasq-dns" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598376 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerName="dnsmasq-dns" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598386 4959 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerName="init" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598391 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerName="init" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598422 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="proxy-httpd" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598429 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="proxy-httpd" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598448 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd267601-4074-4cfb-8b40-8cd5fa12917c" containerName="cinder-db-sync" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598454 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd267601-4074-4cfb-8b40-8cd5fa12917c" containerName="cinder-db-sync" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.598461 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon-log" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598467 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon-log" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598748 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598763 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="proxy-httpd" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598770 4959 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7c7f07f0-81b0-4f10-9327-2eb3e433ee40" containerName="horizon-log" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598780 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd267601-4074-4cfb-8b40-8cd5fa12917c" containerName="cinder-db-sync" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598793 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon-log" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598842 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" containerName="dnsmasq-dns" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598856 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbb164d-72db-4115-a2b4-e4f2beef4afd" containerName="horizon" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.598870 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="sg-core" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.601733 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" containerName="ceilometer-notification-agent" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.617535 4959 scope.go:117] "RemoveContainer" containerID="b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.618806 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.618906 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.621875 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc\": container with ID starting with b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc not found: ID does not exist" containerID="b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.621918 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc"} err="failed to get container status \"b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc\": rpc error: code = NotFound desc = could not find container \"b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc\": container with ID starting with b6d192463e9db3d6ac42057f01be8c36bbb29e3d8f216cebbb8e2cbcfa6a3acc not found: ID does not exist" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.621948 4959 scope.go:117] "RemoveContainer" containerID="606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.621977 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.622227 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:17:53 crc kubenswrapper[4959]: E1007 13:17:53.626927 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46\": container with ID starting with 606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46 not found: ID does not 
exist" containerID="606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.626976 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46"} err="failed to get container status \"606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46\": rpc error: code = NotFound desc = could not find container \"606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46\": container with ID starting with 606e030fa3783154c5cf7047761b556212b88eb4e3485612fbd27e18e9e4bc46 not found: ID does not exist" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651681 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-config-data\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651736 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651760 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-run-httpd\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651791 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjnt\" (UniqueName: 
\"kubernetes.io/projected/17188cfd-fbac-49d7-86a0-b61de72bd81c-kube-api-access-ttjnt\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651814 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-log-httpd\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651837 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.651932 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-scripts\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.756527 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-scripts\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.756652 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-config-data\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 
crc kubenswrapper[4959]: I1007 13:17:53.756684 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.756704 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-run-httpd\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.756735 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjnt\" (UniqueName: \"kubernetes.io/projected/17188cfd-fbac-49d7-86a0-b61de72bd81c-kube-api-access-ttjnt\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.756758 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-log-httpd\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.756782 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.757430 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-run-httpd\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.762331 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-log-httpd\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.770188 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-scripts\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.771801 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.781953 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.782059 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-config-data\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.782452 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.783952 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.785501 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjnt\" (UniqueName: \"kubernetes.io/projected/17188cfd-fbac-49d7-86a0-b61de72bd81c-kube-api-access-ttjnt\") pod \"ceilometer-0\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " pod="openstack/ceilometer-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.791763 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.792082 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.792316 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.792906 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rhk7w" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.794458 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.824317 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85494b87f-gljj2"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.828598 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861547 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6t9\" (UniqueName: \"kubernetes.io/projected/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-kube-api-access-ct6t9\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861612 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-config\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861665 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861724 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-dns-svc\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861764 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " 
pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861804 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-nb\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-sb\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861855 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861879 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l96l8\" (UniqueName: \"kubernetes.io/projected/32a05d93-0c05-45c0-872e-decc26f3fb0e-kube-api-access-l96l8\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861902 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.861925 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.906708 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85494b87f-gljj2"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.955580 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.957128 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964586 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964656 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe145fd4-27dd-45b7-8a62-d56076785e9b-logs\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964685 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe145fd4-27dd-45b7-8a62-d56076785e9b-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964704 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-nb\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964721 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-scripts\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964738 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-sb\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964755 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964771 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l96l8\" (UniqueName: \"kubernetes.io/projected/32a05d93-0c05-45c0-872e-decc26f3fb0e-kube-api-access-l96l8\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc 
kubenswrapper[4959]: I1007 13:17:53.964790 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964807 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964824 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964870 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data-custom\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964899 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6t9\" (UniqueName: \"kubernetes.io/projected/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-kube-api-access-ct6t9\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964922 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964946 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrz9t\" (UniqueName: \"kubernetes.io/projected/fe145fd4-27dd-45b7-8a62-d56076785e9b-kube-api-access-zrz9t\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964974 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-config\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.964996 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.965039 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-dns-svc\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.965542 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.965839 4959 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-dns-svc\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.973570 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.974389 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-nb\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.975020 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-sb\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.975136 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.976596 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-config\") pod 
\"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.980010 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.988334 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.989282 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:53 crc kubenswrapper[4959]: I1007 13:17:53.990176 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.000098 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6t9\" (UniqueName: \"kubernetes.io/projected/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-kube-api-access-ct6t9\") pod \"cinder-scheduler-0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") " pod="openstack/cinder-scheduler-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.000552 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.014474 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l96l8\" (UniqueName: \"kubernetes.io/projected/32a05d93-0c05-45c0-872e-decc26f3fb0e-kube-api-access-l96l8\") pod \"dnsmasq-dns-85494b87f-gljj2\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.066537 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe145fd4-27dd-45b7-8a62-d56076785e9b-logs\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.066864 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe145fd4-27dd-45b7-8a62-d56076785e9b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.066885 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-scripts\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.066915 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.066959 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data-custom\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.066991 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.067008 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrz9t\" (UniqueName: \"kubernetes.io/projected/fe145fd4-27dd-45b7-8a62-d56076785e9b-kube-api-access-zrz9t\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.068279 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe145fd4-27dd-45b7-8a62-d56076785e9b-logs\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.069739 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe145fd4-27dd-45b7-8a62-d56076785e9b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.072761 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-scripts\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 
13:17:54.072947 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data-custom\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.081409 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.093323 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.094018 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrz9t\" (UniqueName: \"kubernetes.io/projected/fe145fd4-27dd-45b7-8a62-d56076785e9b-kube-api-access-zrz9t\") pod \"cinder-api-0\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.199609 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.219534 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.292059 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.646443 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:17:54 crc kubenswrapper[4959]: W1007 13:17:54.654573 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17188cfd_fbac_49d7_86a0_b61de72bd81c.slice/crio-8520c5b7d83774c480837419bff1da19e878a1fab6132961602c77a927ec5693 WatchSource:0}: Error finding container 8520c5b7d83774c480837419bff1da19e878a1fab6132961602c77a927ec5693: Status 404 returned error can't find the container with id 8520c5b7d83774c480837419bff1da19e878a1fab6132961602c77a927ec5693 Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.824421 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a2196b-b17f-49ae-a7f2-3ad72d0ff043" path="/var/lib/kubelet/pods/05a2196b-b17f-49ae-a7f2-3ad72d0ff043/volumes" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.825055 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6955960-fcc4-4d43-9774-fafd72ee3569" path="/var/lib/kubelet/pods/c6955960-fcc4-4d43-9774-fafd72ee3569/volumes" Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.940048 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:17:54 crc kubenswrapper[4959]: W1007 13:17:54.945347 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11336708_94ba_4b0c_b4c4_1f3bb24f44b0.slice/crio-6fb3b4a14748c6bc20b60e64d814f8241c9a528844f24c212ac54d1b1f4d63dc WatchSource:0}: Error finding container 6fb3b4a14748c6bc20b60e64d814f8241c9a528844f24c212ac54d1b1f4d63dc: Status 404 returned error can't find the container with id 6fb3b4a14748c6bc20b60e64d814f8241c9a528844f24c212ac54d1b1f4d63dc Oct 07 13:17:54 crc kubenswrapper[4959]: W1007 
13:17:54.953608 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a05d93_0c05_45c0_872e_decc26f3fb0e.slice/crio-6eaf6b62d947fac20be89ffb3eb65058ded3eea830af1a77a7e9aba9b0738a50 WatchSource:0}: Error finding container 6eaf6b62d947fac20be89ffb3eb65058ded3eea830af1a77a7e9aba9b0738a50: Status 404 returned error can't find the container with id 6eaf6b62d947fac20be89ffb3eb65058ded3eea830af1a77a7e9aba9b0738a50 Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.963722 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85494b87f-gljj2"] Oct 07 13:17:54 crc kubenswrapper[4959]: I1007 13:17:54.980905 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8db6f568-8zwbx" Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.004796 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.345264 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8db6f568-8zwbx" Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.412588 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8465f58844-ng7np"] Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.412874 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8465f58844-ng7np" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api-log" containerID="cri-o://db5dc858cf6080db9c9f095574477211be79a472d22e084f2acc0203db2fdafc" gracePeriod=30 Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.413471 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8465f58844-ng7np" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api" 
containerID="cri-o://c3ae4d1755cf6b038d85ba11679848e44bee70ca4de9e7de82318ac06a200b2a" gracePeriod=30 Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.506980 4959 generic.go:334] "Generic (PLEG): container finished" podID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerID="99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef" exitCode=0 Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.507254 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85494b87f-gljj2" event={"ID":"32a05d93-0c05-45c0-872e-decc26f3fb0e","Type":"ContainerDied","Data":"99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef"} Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.507302 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85494b87f-gljj2" event={"ID":"32a05d93-0c05-45c0-872e-decc26f3fb0e","Type":"ContainerStarted","Data":"6eaf6b62d947fac20be89ffb3eb65058ded3eea830af1a77a7e9aba9b0738a50"} Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.540748 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11336708-94ba-4b0c-b4c4-1f3bb24f44b0","Type":"ContainerStarted","Data":"6fb3b4a14748c6bc20b60e64d814f8241c9a528844f24c212ac54d1b1f4d63dc"} Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.572774 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerStarted","Data":"8520c5b7d83774c480837419bff1da19e878a1fab6132961602c77a927ec5693"} Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.575858 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe145fd4-27dd-45b7-8a62-d56076785e9b","Type":"ContainerStarted","Data":"9d37e0cb8e13e86c49aaf4c9a1c270fb9d5308e8993c465da5acd58a399a09e8"} Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.641359 4959 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-dc68dfcf6-xkrw7" Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.726665 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c99fbb6b6-2j7rt"] Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.729193 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" containerID="cri-o://52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617" gracePeriod=30 Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.728762 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon-log" containerID="cri-o://9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9" gracePeriod=30 Oct 07 13:17:55 crc kubenswrapper[4959]: I1007 13:17:55.751162 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.486098 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.601049 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerStarted","Data":"701fbbcbc18dbd963c5084b28171f71652f596e741de83b02f9a863893d3b4db"} Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.604057 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"fe145fd4-27dd-45b7-8a62-d56076785e9b","Type":"ContainerStarted","Data":"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286"} Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.611146 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85494b87f-gljj2" event={"ID":"32a05d93-0c05-45c0-872e-decc26f3fb0e","Type":"ContainerStarted","Data":"6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017"} Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.612219 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.615973 4959 generic.go:334] "Generic (PLEG): container finished" podID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerID="db5dc858cf6080db9c9f095574477211be79a472d22e084f2acc0203db2fdafc" exitCode=143 Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.616032 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8465f58844-ng7np" event={"ID":"4e95dc55-c289-4042-80c7-bc5253f80e0f","Type":"ContainerDied","Data":"db5dc858cf6080db9c9f095574477211be79a472d22e084f2acc0203db2fdafc"} Oct 07 13:17:56 crc kubenswrapper[4959]: I1007 13:17:56.652282 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85494b87f-gljj2" podStartSLOduration=3.652264608 podStartE2EDuration="3.652264608s" podCreationTimestamp="2025-10-07 13:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:56.652024091 +0000 UTC m=+1028.812746768" watchObservedRunningTime="2025-10-07 13:17:56.652264608 +0000 UTC m=+1028.812987285" Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.626305 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"11336708-94ba-4b0c-b4c4-1f3bb24f44b0","Type":"ContainerStarted","Data":"89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab"} Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.626878 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11336708-94ba-4b0c-b4c4-1f3bb24f44b0","Type":"ContainerStarted","Data":"4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c"} Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.628154 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerStarted","Data":"d59472bd3244d672cb428691586ba5569242112c50c4c389c596c40876b9685d"} Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.630469 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe145fd4-27dd-45b7-8a62-d56076785e9b","Type":"ContainerStarted","Data":"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0"} Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.630694 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api-log" containerID="cri-o://51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286" gracePeriod=30 Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.630708 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api" containerID="cri-o://59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0" gracePeriod=30 Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.654255 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.759291451 podStartE2EDuration="4.654236844s" 
podCreationTimestamp="2025-10-07 13:17:53 +0000 UTC" firstStartedPulling="2025-10-07 13:17:54.95318954 +0000 UTC m=+1027.113912217" lastFinishedPulling="2025-10-07 13:17:55.848134933 +0000 UTC m=+1028.008857610" observedRunningTime="2025-10-07 13:17:57.646144327 +0000 UTC m=+1029.806866994" watchObservedRunningTime="2025-10-07 13:17:57.654236844 +0000 UTC m=+1029.814959521" Oct 07 13:17:57 crc kubenswrapper[4959]: I1007 13:17:57.661856 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.661837137 podStartE2EDuration="4.661837137s" podCreationTimestamp="2025-10-07 13:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:17:57.660951261 +0000 UTC m=+1029.821673948" watchObservedRunningTime="2025-10-07 13:17:57.661837137 +0000 UTC m=+1029.822559814" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.138154 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283683 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrz9t\" (UniqueName: \"kubernetes.io/projected/fe145fd4-27dd-45b7-8a62-d56076785e9b-kube-api-access-zrz9t\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283746 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data-custom\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283779 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe145fd4-27dd-45b7-8a62-d56076785e9b-logs\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283804 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe145fd4-27dd-45b7-8a62-d56076785e9b-etc-machine-id\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283860 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283890 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-combined-ca-bundle\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.283970 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-scripts\") pod \"fe145fd4-27dd-45b7-8a62-d56076785e9b\" (UID: \"fe145fd4-27dd-45b7-8a62-d56076785e9b\") " Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.284288 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe145fd4-27dd-45b7-8a62-d56076785e9b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.290011 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe145fd4-27dd-45b7-8a62-d56076785e9b-logs" (OuterVolumeSpecName: "logs") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.298827 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.298904 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-scripts" (OuterVolumeSpecName: "scripts") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.302897 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe145fd4-27dd-45b7-8a62-d56076785e9b-kube-api-access-zrz9t" (OuterVolumeSpecName: "kube-api-access-zrz9t") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "kube-api-access-zrz9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.316795 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.355334 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data" (OuterVolumeSpecName: "config-data") pod "fe145fd4-27dd-45b7-8a62-d56076785e9b" (UID: "fe145fd4-27dd-45b7-8a62-d56076785e9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387502 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387546 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrz9t\" (UniqueName: \"kubernetes.io/projected/fe145fd4-27dd-45b7-8a62-d56076785e9b-kube-api-access-zrz9t\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387561 4959 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387574 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe145fd4-27dd-45b7-8a62-d56076785e9b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387585 4959 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe145fd4-27dd-45b7-8a62-d56076785e9b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387611 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.387636 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe145fd4-27dd-45b7-8a62-d56076785e9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.574512 4959 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/barbican-api-8465f58844-ng7np" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": read tcp 10.217.0.2:47782->10.217.0.151:9311: read: connection reset by peer" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.574512 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8465f58844-ng7np" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": read tcp 10.217.0.2:47798->10.217.0.151:9311: read: connection reset by peer" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.656836 4959 generic.go:334] "Generic (PLEG): container finished" podID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerID="59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0" exitCode=0 Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.656865 4959 generic.go:334] "Generic (PLEG): container finished" podID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerID="51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286" exitCode=143 Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.656922 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe145fd4-27dd-45b7-8a62-d56076785e9b","Type":"ContainerDied","Data":"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0"} Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.656949 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe145fd4-27dd-45b7-8a62-d56076785e9b","Type":"ContainerDied","Data":"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286"} Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.656959 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"fe145fd4-27dd-45b7-8a62-d56076785e9b","Type":"ContainerDied","Data":"9d37e0cb8e13e86c49aaf4c9a1c270fb9d5308e8993c465da5acd58a399a09e8"} Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.656971 4959 scope.go:117] "RemoveContainer" containerID="59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.657086 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.675468 4959 generic.go:334] "Generic (PLEG): container finished" podID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerID="c3ae4d1755cf6b038d85ba11679848e44bee70ca4de9e7de82318ac06a200b2a" exitCode=0 Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.675579 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8465f58844-ng7np" event={"ID":"4e95dc55-c289-4042-80c7-bc5253f80e0f","Type":"ContainerDied","Data":"c3ae4d1755cf6b038d85ba11679848e44bee70ca4de9e7de82318ac06a200b2a"} Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.705867 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerStarted","Data":"4ac2f6266ebca71342c7f640379bd3461a164eef8ef4b9abadea2be8d0c1f7db"} Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.738596 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.752464 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.761602 4959 scope.go:117] "RemoveContainer" containerID="51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.761695 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 
13:17:58 crc kubenswrapper[4959]: E1007 13:17:58.762185 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api-log" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.762204 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api-log" Oct 07 13:17:58 crc kubenswrapper[4959]: E1007 13:17:58.762219 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.762225 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.762390 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api-log" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.762414 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" containerName="cinder-api" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.763643 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.765686 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.765892 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.768290 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.783737 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.819787 4959 scope.go:117] "RemoveContainer" containerID="59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0" Oct 07 13:17:58 crc kubenswrapper[4959]: E1007 13:17:58.820279 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0\": container with ID starting with 59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0 not found: ID does not exist" containerID="59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.820318 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0"} err="failed to get container status \"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0\": rpc error: code = NotFound desc = could not find container \"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0\": container with ID starting with 59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0 not found: ID does not exist" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 
13:17:58.820345 4959 scope.go:117] "RemoveContainer" containerID="51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286" Oct 07 13:17:58 crc kubenswrapper[4959]: E1007 13:17:58.823804 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286\": container with ID starting with 51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286 not found: ID does not exist" containerID="51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.823836 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286"} err="failed to get container status \"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286\": rpc error: code = NotFound desc = could not find container \"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286\": container with ID starting with 51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286 not found: ID does not exist" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.823859 4959 scope.go:117] "RemoveContainer" containerID="59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.824233 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0"} err="failed to get container status \"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0\": rpc error: code = NotFound desc = could not find container \"59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0\": container with ID starting with 59a154ee4216dbe5b840b68a44c4bbaeeb54189d0f019b1d333a171528242ad0 not found: ID does not exist" Oct 07 13:17:58 crc 
kubenswrapper[4959]: I1007 13:17:58.824258 4959 scope.go:117] "RemoveContainer" containerID="51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.824466 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286"} err="failed to get container status \"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286\": rpc error: code = NotFound desc = could not find container \"51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286\": container with ID starting with 51c26859b5c06d7866d805c0dba8384d9b349cc4c417bb80b053b21a1f3e0286 not found: ID does not exist" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.827239 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe145fd4-27dd-45b7-8a62-d56076785e9b" path="/var/lib/kubelet/pods/fe145fd4-27dd-45b7-8a62-d56076785e9b/volumes" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.900573 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.900664 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-config-data\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.900684 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54a9118f-48be-4663-ba53-6e107a5d09e8-logs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.900808 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a9118f-48be-4663-ba53-6e107a5d09e8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.901004 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.901037 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxzlh\" (UniqueName: \"kubernetes.io/projected/54a9118f-48be-4663-ba53-6e107a5d09e8-kube-api-access-sxzlh\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.901065 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-config-data-custom\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.901330 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.901606 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-scripts\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:58 crc kubenswrapper[4959]: I1007 13:17:58.987879 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8465f58844-ng7np" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.003807 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a9118f-48be-4663-ba53-6e107a5d09e8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.003951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.003989 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxzlh\" (UniqueName: \"kubernetes.io/projected/54a9118f-48be-4663-ba53-6e107a5d09e8-kube-api-access-sxzlh\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.004010 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-config-data-custom\") 
pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.004050 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.004139 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-scripts\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.004231 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.004309 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-config-data\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.004335 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a9118f-48be-4663-ba53-6e107a5d09e8-logs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0" Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.005375 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/54a9118f-48be-4663-ba53-6e107a5d09e8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.016297 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.018094 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.019205 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.022570 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-config-data-custom\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.023953 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-scripts\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.025039 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a9118f-48be-4663-ba53-6e107a5d09e8-logs\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.025563 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9118f-48be-4663-ba53-6e107a5d09e8-config-data\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.032230 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxzlh\" (UniqueName: \"kubernetes.io/projected/54a9118f-48be-4663-ba53-6e107a5d09e8-kube-api-access-sxzlh\") pod \"cinder-api-0\" (UID: \"54a9118f-48be-4663-ba53-6e107a5d09e8\") " pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.095941 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.105928 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e95dc55-c289-4042-80c7-bc5253f80e0f-logs\") pod \"4e95dc55-c289-4042-80c7-bc5253f80e0f\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") "
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.105999 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dpg5\" (UniqueName: \"kubernetes.io/projected/4e95dc55-c289-4042-80c7-bc5253f80e0f-kube-api-access-9dpg5\") pod \"4e95dc55-c289-4042-80c7-bc5253f80e0f\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") "
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.106036 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data-custom\") pod \"4e95dc55-c289-4042-80c7-bc5253f80e0f\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") "
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.106057 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-combined-ca-bundle\") pod \"4e95dc55-c289-4042-80c7-bc5253f80e0f\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") "
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.106123 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data\") pod \"4e95dc55-c289-4042-80c7-bc5253f80e0f\" (UID: \"4e95dc55-c289-4042-80c7-bc5253f80e0f\") "
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.106301 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e95dc55-c289-4042-80c7-bc5253f80e0f-logs" (OuterVolumeSpecName: "logs") pod "4e95dc55-c289-4042-80c7-bc5253f80e0f" (UID: "4e95dc55-c289-4042-80c7-bc5253f80e0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.106639 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e95dc55-c289-4042-80c7-bc5253f80e0f-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.108955 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e95dc55-c289-4042-80c7-bc5253f80e0f-kube-api-access-9dpg5" (OuterVolumeSpecName: "kube-api-access-9dpg5") pod "4e95dc55-c289-4042-80c7-bc5253f80e0f" (UID: "4e95dc55-c289-4042-80c7-bc5253f80e0f"). InnerVolumeSpecName "kube-api-access-9dpg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.109882 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4e95dc55-c289-4042-80c7-bc5253f80e0f" (UID: "4e95dc55-c289-4042-80c7-bc5253f80e0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.136339 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e95dc55-c289-4042-80c7-bc5253f80e0f" (UID: "4e95dc55-c289-4042-80c7-bc5253f80e0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.143681 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:38168->10.217.0.143:8443: read: connection reset by peer"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.175001 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data" (OuterVolumeSpecName: "config-data") pod "4e95dc55-c289-4042-80c7-bc5253f80e0f" (UID: "4e95dc55-c289-4042-80c7-bc5253f80e0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.199957 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.210580 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dpg5\" (UniqueName: \"kubernetes.io/projected/4e95dc55-c289-4042-80c7-bc5253f80e0f-kube-api-access-9dpg5\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.210621 4959 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.210651 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.210664 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e95dc55-c289-4042-80c7-bc5253f80e0f-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.552791 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.714136 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8465f58844-ng7np" event={"ID":"4e95dc55-c289-4042-80c7-bc5253f80e0f","Type":"ContainerDied","Data":"55d665fb7fb4ef981cf695f9a95b3311468fa68ca993f5df51cd2940df9704c3"}
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.714163 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8465f58844-ng7np"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.714200 4959 scope.go:117] "RemoveContainer" containerID="c3ae4d1755cf6b038d85ba11679848e44bee70ca4de9e7de82318ac06a200b2a"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.720741 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerStarted","Data":"feb43f9bc0ea194654d03a90a38be34787fbfbd0b13ee4d3b87facdbc413fc01"}
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.720873 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.726474 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a9118f-48be-4663-ba53-6e107a5d09e8","Type":"ContainerStarted","Data":"02315eaaa335a1c195253c91d618ab6fba0d97fd1a09673bcc2833d13e0a3fd6"}
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.728400 4959 generic.go:334] "Generic (PLEG): container finished" podID="468d7f11-8929-410e-a59d-1f78cc33a279" containerID="52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617" exitCode=0
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.728949 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c99fbb6b6-2j7rt" event={"ID":"468d7f11-8929-410e-a59d-1f78cc33a279","Type":"ContainerDied","Data":"52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617"}
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.748820 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.004364628 podStartE2EDuration="6.748804168s" podCreationTimestamp="2025-10-07 13:17:53 +0000 UTC" firstStartedPulling="2025-10-07 13:17:54.65798296 +0000 UTC m=+1026.818705637" lastFinishedPulling="2025-10-07 13:17:59.4024225 +0000 UTC m=+1031.563145177" observedRunningTime="2025-10-07 13:17:59.744998997 +0000 UTC m=+1031.905721674" watchObservedRunningTime="2025-10-07 13:17:59.748804168 +0000 UTC m=+1031.909526835"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.750484 4959 scope.go:117] "RemoveContainer" containerID="db5dc858cf6080db9c9f095574477211be79a472d22e084f2acc0203db2fdafc"
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.776249 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8465f58844-ng7np"]
Oct 07 13:17:59 crc kubenswrapper[4959]: I1007 13:17:59.792994 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8465f58844-ng7np"]
Oct 07 13:18:00 crc kubenswrapper[4959]: I1007 13:18:00.559528 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused"
Oct 07 13:18:00 crc kubenswrapper[4959]: I1007 13:18:00.741533 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a9118f-48be-4663-ba53-6e107a5d09e8","Type":"ContainerStarted","Data":"d468afab634d2947f6b779523978c4824ab773f6716d8df01ef33d2c07033bb5"}
Oct 07 13:18:00 crc kubenswrapper[4959]: I1007 13:18:00.741581 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a9118f-48be-4663-ba53-6e107a5d09e8","Type":"ContainerStarted","Data":"5b7f361d486595356799db15d1103150de1c5d9ec95df1bdb0e663d9d527aec7"}
Oct 07 13:18:00 crc kubenswrapper[4959]: I1007 13:18:00.741676 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 07 13:18:00 crc kubenswrapper[4959]: I1007 13:18:00.769740 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.769723029 podStartE2EDuration="2.769723029s" podCreationTimestamp="2025-10-07 13:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:18:00.766813683 +0000 UTC m=+1032.927536400" watchObservedRunningTime="2025-10-07 13:18:00.769723029 +0000 UTC m=+1032.930445696"
Oct 07 13:18:00 crc kubenswrapper[4959]: I1007 13:18:00.821377 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" path="/var/lib/kubelet/pods/4e95dc55-c289-4042-80c7-bc5253f80e0f/volumes"
Oct 07 13:18:01 crc kubenswrapper[4959]: I1007 13:18:01.890528 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-54f9969c74-l8zmx"
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.220816 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85494b87f-gljj2"
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.302129 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"]
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.302353 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerName="dnsmasq-dns" containerID="cri-o://33b11c80ee43caa19ebb078fa9d5df7b1c49895db7d508051650adc20a1e0692" gracePeriod=10
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.496188 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.549538 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.785809 4959 generic.go:334] "Generic (PLEG): container finished" podID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerID="33b11c80ee43caa19ebb078fa9d5df7b1c49895db7d508051650adc20a1e0692" exitCode=0
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.786064 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="cinder-scheduler" containerID="cri-o://4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c" gracePeriod=30
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.786389 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" event={"ID":"b291ed18-9a20-42a8-9cf7-64a9ef0251db","Type":"ContainerDied","Data":"33b11c80ee43caa19ebb078fa9d5df7b1c49895db7d508051650adc20a1e0692"}
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.786423 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g" event={"ID":"b291ed18-9a20-42a8-9cf7-64a9ef0251db","Type":"ContainerDied","Data":"1f6c3c47fa4ad666c4d9e80d4066293f1a9b49ed7c428e15cde17ab768407997"}
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.786437 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6c3c47fa4ad666c4d9e80d4066293f1a9b49ed7c428e15cde17ab768407997"
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.786792 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="probe" containerID="cri-o://89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab" gracePeriod=30
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.830228 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.919889 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznvf\" (UniqueName: \"kubernetes.io/projected/b291ed18-9a20-42a8-9cf7-64a9ef0251db-kube-api-access-gznvf\") pod \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") "
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.920019 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-dns-svc\") pod \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") "
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.920059 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-nb\") pod \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") "
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.920089 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-config\") pod \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") "
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.920165 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-sb\") pod \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\" (UID: \"b291ed18-9a20-42a8-9cf7-64a9ef0251db\") "
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.928205 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b291ed18-9a20-42a8-9cf7-64a9ef0251db-kube-api-access-gznvf" (OuterVolumeSpecName: "kube-api-access-gznvf") pod "b291ed18-9a20-42a8-9cf7-64a9ef0251db" (UID: "b291ed18-9a20-42a8-9cf7-64a9ef0251db"). InnerVolumeSpecName "kube-api-access-gznvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.975058 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b291ed18-9a20-42a8-9cf7-64a9ef0251db" (UID: "b291ed18-9a20-42a8-9cf7-64a9ef0251db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.977498 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b291ed18-9a20-42a8-9cf7-64a9ef0251db" (UID: "b291ed18-9a20-42a8-9cf7-64a9ef0251db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.980127 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b291ed18-9a20-42a8-9cf7-64a9ef0251db" (UID: "b291ed18-9a20-42a8-9cf7-64a9ef0251db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:18:04 crc kubenswrapper[4959]: I1007 13:18:04.992937 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-config" (OuterVolumeSpecName: "config") pod "b291ed18-9a20-42a8-9cf7-64a9ef0251db" (UID: "b291ed18-9a20-42a8-9cf7-64a9ef0251db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.022138 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.022185 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.022195 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.022207 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b291ed18-9a20-42a8-9cf7-64a9ef0251db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.022216 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznvf\" (UniqueName: \"kubernetes.io/projected/b291ed18-9a20-42a8-9cf7-64a9ef0251db-kube-api-access-gznvf\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.797350 4959 generic.go:334] "Generic (PLEG): container finished" podID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerID="89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab" exitCode=0
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.797446 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11336708-94ba-4b0c-b4c4-1f3bb24f44b0","Type":"ContainerDied","Data":"89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab"}
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.797778 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.838459 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"]
Oct 07 13:18:05 crc kubenswrapper[4959]: I1007 13:18:05.845618 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc8d75dbf-hpb8g"]
Oct 07 13:18:06 crc kubenswrapper[4959]: I1007 13:18:06.817939 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" path="/var/lib/kubelet/pods/b291ed18-9a20-42a8-9cf7-64a9ef0251db/volumes"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.008497 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 07 13:18:07 crc kubenswrapper[4959]: E1007 13:18:07.009536 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerName="init"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009573 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerName="init"
Oct 07 13:18:07 crc kubenswrapper[4959]: E1007 13:18:07.009597 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009605 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api"
Oct 07 13:18:07 crc kubenswrapper[4959]: E1007 13:18:07.009645 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api-log"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009656 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api-log"
Oct 07 13:18:07 crc kubenswrapper[4959]: E1007 13:18:07.009667 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerName="dnsmasq-dns"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009675 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerName="dnsmasq-dns"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009932 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009948 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b291ed18-9a20-42a8-9cf7-64a9ef0251db" containerName="dnsmasq-dns"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.009961 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e95dc55-c289-4042-80c7-bc5253f80e0f" containerName="barbican-api-log"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.010808 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.012570 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gbgml"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.013689 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.013961 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.022605 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.162556 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-openstack-config-secret\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.162603 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-openstack-config\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.162657 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qp5p\" (UniqueName: \"kubernetes.io/projected/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-kube-api-access-6qp5p\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.162833 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.264555 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-openstack-config-secret\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.264616 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-openstack-config\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.264671 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qp5p\" (UniqueName: \"kubernetes.io/projected/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-kube-api-access-6qp5p\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.264721 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.265534 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-openstack-config\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.270299 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-openstack-config-secret\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.279332 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.284123 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qp5p\" (UniqueName: \"kubernetes.io/projected/eab0abf5-c944-4a5c-9259-6dc0ea2b115f-kube-api-access-6qp5p\") pod \"openstackclient\" (UID: \"eab0abf5-c944-4a5c-9259-6dc0ea2b115f\") " pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.338953 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.635452 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-94d9575bd-dbgt6"
Oct 07 13:18:07 crc kubenswrapper[4959]: I1007 13:18:07.841201 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.581194 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.689688 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-etc-machine-id\") pod \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") "
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.689805 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data\") pod \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") "
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.689802 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "11336708-94ba-4b0c-b4c4-1f3bb24f44b0" (UID: "11336708-94ba-4b0c-b4c4-1f3bb24f44b0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.689854 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-combined-ca-bundle\") pod \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") "
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.689876 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-scripts\") pod \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") "
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.689985 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data-custom\") pod \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") "
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.690011 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6t9\" (UniqueName: \"kubernetes.io/projected/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-kube-api-access-ct6t9\") pod \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\" (UID: \"11336708-94ba-4b0c-b4c4-1f3bb24f44b0\") "
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.690341 4959 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.698924 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-scripts" (OuterVolumeSpecName: "scripts") pod "11336708-94ba-4b0c-b4c4-1f3bb24f44b0" (UID: "11336708-94ba-4b0c-b4c4-1f3bb24f44b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.698956 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-kube-api-access-ct6t9" (OuterVolumeSpecName: "kube-api-access-ct6t9") pod "11336708-94ba-4b0c-b4c4-1f3bb24f44b0" (UID: "11336708-94ba-4b0c-b4c4-1f3bb24f44b0"). InnerVolumeSpecName "kube-api-access-ct6t9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.711601 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "11336708-94ba-4b0c-b4c4-1f3bb24f44b0" (UID: "11336708-94ba-4b0c-b4c4-1f3bb24f44b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.746576 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11336708-94ba-4b0c-b4c4-1f3bb24f44b0" (UID: "11336708-94ba-4b0c-b4c4-1f3bb24f44b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.792402 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.792437 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.792450 4959 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.792461 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6t9\" (UniqueName: \"kubernetes.io/projected/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-kube-api-access-ct6t9\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.824056 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data" (OuterVolumeSpecName: "config-data") pod "11336708-94ba-4b0c-b4c4-1f3bb24f44b0" (UID: "11336708-94ba-4b0c-b4c4-1f3bb24f44b0"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.825181 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"eab0abf5-c944-4a5c-9259-6dc0ea2b115f","Type":"ContainerStarted","Data":"06500f44db23d72621f2cd76997107a2bf26230b6178d66937528710daa734f7"} Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.828946 4959 generic.go:334] "Generic (PLEG): container finished" podID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerID="4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c" exitCode=0 Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.829008 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.829012 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11336708-94ba-4b0c-b4c4-1f3bb24f44b0","Type":"ContainerDied","Data":"4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c"} Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.829073 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11336708-94ba-4b0c-b4c4-1f3bb24f44b0","Type":"ContainerDied","Data":"6fb3b4a14748c6bc20b60e64d814f8241c9a528844f24c212ac54d1b1f4d63dc"} Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.829097 4959 scope.go:117] "RemoveContainer" containerID="89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.858692 4959 scope.go:117] "RemoveContainer" containerID="4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.868756 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.883657 4959 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.893699 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336708-94ba-4b0c-b4c4-1f3bb24f44b0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.894263 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:18:08 crc kubenswrapper[4959]: E1007 13:18:08.894642 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="cinder-scheduler" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.894678 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="cinder-scheduler" Oct 07 13:18:08 crc kubenswrapper[4959]: E1007 13:18:08.894704 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="probe" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.894711 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="probe" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.894908 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="cinder-scheduler" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.894929 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" containerName="probe" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.895066 4959 scope.go:117] "RemoveContainer" containerID="89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab" Oct 07 13:18:08 crc kubenswrapper[4959]: E1007 13:18:08.895518 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab\": container with ID starting with 89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab not found: ID does not exist" containerID="89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.895550 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab"} err="failed to get container status \"89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab\": rpc error: code = NotFound desc = could not find container \"89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab\": container with ID starting with 89ab3923f4c660252cb246b09fc17d659555a09e3cc3830ac27a70590115c6ab not found: ID does not exist" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.895571 4959 scope.go:117] "RemoveContainer" containerID="4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.895834 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: E1007 13:18:08.897422 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c\": container with ID starting with 4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c not found: ID does not exist" containerID="4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.897469 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c"} err="failed to get container status \"4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c\": rpc error: code = NotFound desc = could not find container \"4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c\": container with ID starting with 4754020e6b7f18d8fb49e35c5f1901a4e63f7cadbd3fbb5ffa0339d0fb9e3c6c not found: ID does not exist" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.899594 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.902204 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.994987 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.995232 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.995287 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwmks\" (UniqueName: \"kubernetes.io/projected/96e0aa23-8c42-4616-af38-0eb612e5f181-kube-api-access-dwmks\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.995364 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-scripts\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.995507 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-config-data\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:08 crc kubenswrapper[4959]: I1007 13:18:08.995647 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96e0aa23-8c42-4616-af38-0eb612e5f181-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.098706 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-config-data\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.098782 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96e0aa23-8c42-4616-af38-0eb612e5f181-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.098863 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.098893 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96e0aa23-8c42-4616-af38-0eb612e5f181-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.098981 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.099008 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwmks\" (UniqueName: \"kubernetes.io/projected/96e0aa23-8c42-4616-af38-0eb612e5f181-kube-api-access-dwmks\") pod \"cinder-scheduler-0\" (UID: 
\"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.099031 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-scripts\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.102496 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.104299 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.107064 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-scripts\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.113512 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e0aa23-8c42-4616-af38-0eb612e5f181-config-data\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.116357 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dwmks\" (UniqueName: \"kubernetes.io/projected/96e0aa23-8c42-4616-af38-0eb612e5f181-kube-api-access-dwmks\") pod \"cinder-scheduler-0\" (UID: \"96e0aa23-8c42-4616-af38-0eb612e5f181\") " pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.218432 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.500907 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9np9s"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.502347 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.511921 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9np9s"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.599979 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-27bts"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.601268 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.622056 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vlfg\" (UniqueName: \"kubernetes.io/projected/65663a28-f7e8-430c-92e9-ba8e346b04ba-kube-api-access-2vlfg\") pod \"nova-api-db-create-9np9s\" (UID: \"65663a28-f7e8-430c-92e9-ba8e346b04ba\") " pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.647679 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-27bts"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.723675 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vlfg\" (UniqueName: \"kubernetes.io/projected/65663a28-f7e8-430c-92e9-ba8e346b04ba-kube-api-access-2vlfg\") pod \"nova-api-db-create-9np9s\" (UID: \"65663a28-f7e8-430c-92e9-ba8e346b04ba\") " pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.723727 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbhgl\" (UniqueName: \"kubernetes.io/projected/e09add8f-d15e-47ea-83c3-8cd2512ae67a-kube-api-access-nbhgl\") pod \"nova-cell0-db-create-27bts\" (UID: \"e09add8f-d15e-47ea-83c3-8cd2512ae67a\") " pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.748363 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vlfg\" (UniqueName: \"kubernetes.io/projected/65663a28-f7e8-430c-92e9-ba8e346b04ba-kube-api-access-2vlfg\") pod \"nova-api-db-create-9np9s\" (UID: \"65663a28-f7e8-430c-92e9-ba8e346b04ba\") " pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.803599 4959 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-db-create-rqlwl"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.805077 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.829944 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbhgl\" (UniqueName: \"kubernetes.io/projected/e09add8f-d15e-47ea-83c3-8cd2512ae67a-kube-api-access-nbhgl\") pod \"nova-cell0-db-create-27bts\" (UID: \"e09add8f-d15e-47ea-83c3-8cd2512ae67a\") " pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.838732 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rqlwl"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.861797 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.875548 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96e0aa23-8c42-4616-af38-0eb612e5f181","Type":"ContainerStarted","Data":"34a7384da99f54eaf6efcf71b1f73719402959bb8d49907805988ab6607b08a1"} Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.912498 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbhgl\" (UniqueName: \"kubernetes.io/projected/e09add8f-d15e-47ea-83c3-8cd2512ae67a-kube-api-access-nbhgl\") pod \"nova-cell0-db-create-27bts\" (UID: \"e09add8f-d15e-47ea-83c3-8cd2512ae67a\") " pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.918846 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.935338 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wmnqm\" (UniqueName: \"kubernetes.io/projected/a87f137d-3e0f-423b-af71-2197ae7d9cf2-kube-api-access-wmnqm\") pod \"nova-cell1-db-create-rqlwl\" (UID: \"a87f137d-3e0f-423b-af71-2197ae7d9cf2\") " pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:09 crc kubenswrapper[4959]: I1007 13:18:09.956014 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.037661 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmnqm\" (UniqueName: \"kubernetes.io/projected/a87f137d-3e0f-423b-af71-2197ae7d9cf2-kube-api-access-wmnqm\") pod \"nova-cell1-db-create-rqlwl\" (UID: \"a87f137d-3e0f-423b-af71-2197ae7d9cf2\") " pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.062836 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmnqm\" (UniqueName: \"kubernetes.io/projected/a87f137d-3e0f-423b-af71-2197ae7d9cf2-kube-api-access-wmnqm\") pod \"nova-cell1-db-create-rqlwl\" (UID: \"a87f137d-3e0f-423b-af71-2197ae7d9cf2\") " pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.283577 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.410792 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9np9s"] Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.535788 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-27bts"] Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.559209 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.823211 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11336708-94ba-4b0c-b4c4-1f3bb24f44b0" path="/var/lib/kubelet/pods/11336708-94ba-4b0c-b4c4-1f3bb24f44b0/volumes" Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.842113 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rqlwl"] Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.903596 4959 generic.go:334] "Generic (PLEG): container finished" podID="e09add8f-d15e-47ea-83c3-8cd2512ae67a" containerID="0cc6a6b5660cbd3b85ab9a6de6de515f6e3526281b46c322674a33d2fb539da4" exitCode=0 Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.903656 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-27bts" event={"ID":"e09add8f-d15e-47ea-83c3-8cd2512ae67a","Type":"ContainerDied","Data":"0cc6a6b5660cbd3b85ab9a6de6de515f6e3526281b46c322674a33d2fb539da4"} Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.903703 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-27bts" 
event={"ID":"e09add8f-d15e-47ea-83c3-8cd2512ae67a","Type":"ContainerStarted","Data":"f1b2608e69abb9958490561f7993ecce09dd37faafde668a47053b0bca17117a"} Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.908340 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96e0aa23-8c42-4616-af38-0eb612e5f181","Type":"ContainerStarted","Data":"5b3f4c4ebed5197e81fa25e9618889ed40ce1d199530e7e0994613036b8f5416"} Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.910472 4959 generic.go:334] "Generic (PLEG): container finished" podID="65663a28-f7e8-430c-92e9-ba8e346b04ba" containerID="d3bf710636f8a743877ef42a6ba13f827b85d4ca2d916821a3add78c86983b6c" exitCode=0 Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.910510 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9np9s" event={"ID":"65663a28-f7e8-430c-92e9-ba8e346b04ba","Type":"ContainerDied","Data":"d3bf710636f8a743877ef42a6ba13f827b85d4ca2d916821a3add78c86983b6c"} Oct 07 13:18:10 crc kubenswrapper[4959]: I1007 13:18:10.910534 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9np9s" event={"ID":"65663a28-f7e8-430c-92e9-ba8e346b04ba","Type":"ContainerStarted","Data":"6759177f2b97ca5779134315ed9acbe407533387932d727d52c22427fdcf2b6d"} Oct 07 13:18:10 crc kubenswrapper[4959]: W1007 13:18:10.922864 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda87f137d_3e0f_423b_af71_2197ae7d9cf2.slice/crio-acc8527e9cb6e4966d46717039478f875bbc7adcfcd103de3ad053f10df878f9 WatchSource:0}: Error finding container acc8527e9cb6e4966d46717039478f875bbc7adcfcd103de3ad053f10df878f9: Status 404 returned error can't find the container with id acc8527e9cb6e4966d46717039478f875bbc7adcfcd103de3ad053f10df878f9 Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.109954 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-86cfbf9b4f-pxglw" Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.171269 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-94d9575bd-dbgt6"] Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.171502 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-94d9575bd-dbgt6" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-api" containerID="cri-o://c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc" gracePeriod=30 Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.171908 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-94d9575bd-dbgt6" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-httpd" containerID="cri-o://e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018" gracePeriod=30 Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.782356 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.923333 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96e0aa23-8c42-4616-af38-0eb612e5f181","Type":"ContainerStarted","Data":"9184fd89cf1938e41b77691f82cf86a0e4b58849a0948fc99021aaa41ce5bcb7"} Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.941299 4959 generic.go:334] "Generic (PLEG): container finished" podID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerID="e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018" exitCode=0 Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.941362 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94d9575bd-dbgt6" event={"ID":"e4e49e10-6ce8-4bde-b15c-141fe2479574","Type":"ContainerDied","Data":"e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018"} Oct 07 13:18:11 crc kubenswrapper[4959]: 
I1007 13:18:11.958878 4959 generic.go:334] "Generic (PLEG): container finished" podID="a87f137d-3e0f-423b-af71-2197ae7d9cf2" containerID="5c2721ebba6fc60a7a04d1082da2d836990e7c75a046a89409eb55ee38ce5e2b" exitCode=0 Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.959464 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rqlwl" event={"ID":"a87f137d-3e0f-423b-af71-2197ae7d9cf2","Type":"ContainerDied","Data":"5c2721ebba6fc60a7a04d1082da2d836990e7c75a046a89409eb55ee38ce5e2b"} Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.959505 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rqlwl" event={"ID":"a87f137d-3e0f-423b-af71-2197ae7d9cf2","Type":"ContainerStarted","Data":"acc8527e9cb6e4966d46717039478f875bbc7adcfcd103de3ad053f10df878f9"} Oct 07 13:18:11 crc kubenswrapper[4959]: I1007 13:18:11.980608 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.980587547 podStartE2EDuration="3.980587547s" podCreationTimestamp="2025-10-07 13:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:18:11.951091624 +0000 UTC m=+1044.111814301" watchObservedRunningTime="2025-10-07 13:18:11.980587547 +0000 UTC m=+1044.141310224" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.429953 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.434264 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.489945 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbhgl\" (UniqueName: \"kubernetes.io/projected/e09add8f-d15e-47ea-83c3-8cd2512ae67a-kube-api-access-nbhgl\") pod \"e09add8f-d15e-47ea-83c3-8cd2512ae67a\" (UID: \"e09add8f-d15e-47ea-83c3-8cd2512ae67a\") " Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.489997 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vlfg\" (UniqueName: \"kubernetes.io/projected/65663a28-f7e8-430c-92e9-ba8e346b04ba-kube-api-access-2vlfg\") pod \"65663a28-f7e8-430c-92e9-ba8e346b04ba\" (UID: \"65663a28-f7e8-430c-92e9-ba8e346b04ba\") " Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.496235 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09add8f-d15e-47ea-83c3-8cd2512ae67a-kube-api-access-nbhgl" (OuterVolumeSpecName: "kube-api-access-nbhgl") pod "e09add8f-d15e-47ea-83c3-8cd2512ae67a" (UID: "e09add8f-d15e-47ea-83c3-8cd2512ae67a"). InnerVolumeSpecName "kube-api-access-nbhgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.496278 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65663a28-f7e8-430c-92e9-ba8e346b04ba-kube-api-access-2vlfg" (OuterVolumeSpecName: "kube-api-access-2vlfg") pod "65663a28-f7e8-430c-92e9-ba8e346b04ba" (UID: "65663a28-f7e8-430c-92e9-ba8e346b04ba"). InnerVolumeSpecName "kube-api-access-2vlfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.592466 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbhgl\" (UniqueName: \"kubernetes.io/projected/e09add8f-d15e-47ea-83c3-8cd2512ae67a-kube-api-access-nbhgl\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.592509 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vlfg\" (UniqueName: \"kubernetes.io/projected/65663a28-f7e8-430c-92e9-ba8e346b04ba-kube-api-access-2vlfg\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.982610 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9np9s" event={"ID":"65663a28-f7e8-430c-92e9-ba8e346b04ba","Type":"ContainerDied","Data":"6759177f2b97ca5779134315ed9acbe407533387932d727d52c22427fdcf2b6d"} Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.982872 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6759177f2b97ca5779134315ed9acbe407533387932d727d52c22427fdcf2b6d" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.982761 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9np9s" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.986953 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-27bts" Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.987449 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-27bts" event={"ID":"e09add8f-d15e-47ea-83c3-8cd2512ae67a","Type":"ContainerDied","Data":"f1b2608e69abb9958490561f7993ecce09dd37faafde668a47053b0bca17117a"} Oct 07 13:18:12 crc kubenswrapper[4959]: I1007 13:18:12.987523 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b2608e69abb9958490561f7993ecce09dd37faafde668a47053b0bca17117a" Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.283992 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.307933 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmnqm\" (UniqueName: \"kubernetes.io/projected/a87f137d-3e0f-423b-af71-2197ae7d9cf2-kube-api-access-wmnqm\") pod \"a87f137d-3e0f-423b-af71-2197ae7d9cf2\" (UID: \"a87f137d-3e0f-423b-af71-2197ae7d9cf2\") " Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.313270 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87f137d-3e0f-423b-af71-2197ae7d9cf2-kube-api-access-wmnqm" (OuterVolumeSpecName: "kube-api-access-wmnqm") pod "a87f137d-3e0f-423b-af71-2197ae7d9cf2" (UID: "a87f137d-3e0f-423b-af71-2197ae7d9cf2"). InnerVolumeSpecName "kube-api-access-wmnqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.409880 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmnqm\" (UniqueName: \"kubernetes.io/projected/a87f137d-3e0f-423b-af71-2197ae7d9cf2-kube-api-access-wmnqm\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.997155 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rqlwl" event={"ID":"a87f137d-3e0f-423b-af71-2197ae7d9cf2","Type":"ContainerDied","Data":"acc8527e9cb6e4966d46717039478f875bbc7adcfcd103de3ad053f10df878f9"} Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.997400 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc8527e9cb6e4966d46717039478f875bbc7adcfcd103de3ad053f10df878f9" Oct 07 13:18:13 crc kubenswrapper[4959]: I1007 13:18:13.997208 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rqlwl" Oct 07 13:18:14 crc kubenswrapper[4959]: I1007 13:18:14.219012 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 13:18:15 crc kubenswrapper[4959]: I1007 13:18:15.388183 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:15 crc kubenswrapper[4959]: I1007 13:18:15.388767 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-central-agent" containerID="cri-o://701fbbcbc18dbd963c5084b28171f71652f596e741de83b02f9a863893d3b4db" gracePeriod=30 Oct 07 13:18:15 crc kubenswrapper[4959]: I1007 13:18:15.389192 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="sg-core" 
containerID="cri-o://4ac2f6266ebca71342c7f640379bd3461a164eef8ef4b9abadea2be8d0c1f7db" gracePeriod=30 Oct 07 13:18:15 crc kubenswrapper[4959]: I1007 13:18:15.389239 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-notification-agent" containerID="cri-o://d59472bd3244d672cb428691586ba5569242112c50c4c389c596c40876b9685d" gracePeriod=30 Oct 07 13:18:15 crc kubenswrapper[4959]: I1007 13:18:15.389181 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="proxy-httpd" containerID="cri-o://feb43f9bc0ea194654d03a90a38be34787fbfbd0b13ee4d3b87facdbc413fc01" gracePeriod=30 Oct 07 13:18:15 crc kubenswrapper[4959]: I1007 13:18:15.394766 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 13:18:16 crc kubenswrapper[4959]: I1007 13:18:16.023930 4959 generic.go:334] "Generic (PLEG): container finished" podID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerID="feb43f9bc0ea194654d03a90a38be34787fbfbd0b13ee4d3b87facdbc413fc01" exitCode=0 Oct 07 13:18:16 crc kubenswrapper[4959]: I1007 13:18:16.024206 4959 generic.go:334] "Generic (PLEG): container finished" podID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerID="4ac2f6266ebca71342c7f640379bd3461a164eef8ef4b9abadea2be8d0c1f7db" exitCode=2 Oct 07 13:18:16 crc kubenswrapper[4959]: I1007 13:18:16.024285 4959 generic.go:334] "Generic (PLEG): container finished" podID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerID="701fbbcbc18dbd963c5084b28171f71652f596e741de83b02f9a863893d3b4db" exitCode=0 Oct 07 13:18:16 crc kubenswrapper[4959]: I1007 13:18:16.024366 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerDied","Data":"feb43f9bc0ea194654d03a90a38be34787fbfbd0b13ee4d3b87facdbc413fc01"} Oct 07 13:18:16 crc kubenswrapper[4959]: I1007 13:18:16.024466 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerDied","Data":"4ac2f6266ebca71342c7f640379bd3461a164eef8ef4b9abadea2be8d0c1f7db"} Oct 07 13:18:16 crc kubenswrapper[4959]: I1007 13:18:16.024580 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerDied","Data":"701fbbcbc18dbd963c5084b28171f71652f596e741de83b02f9a863893d3b4db"} Oct 07 13:18:18 crc kubenswrapper[4959]: I1007 13:18:18.048641 4959 generic.go:334] "Generic (PLEG): container finished" podID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerID="d59472bd3244d672cb428691586ba5569242112c50c4c389c596c40876b9685d" exitCode=0 Oct 07 13:18:18 crc kubenswrapper[4959]: I1007 13:18:18.048665 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerDied","Data":"d59472bd3244d672cb428691586ba5569242112c50c4c389c596c40876b9685d"} Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.475037 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.637987 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.747606 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-scripts\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.748074 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-combined-ca-bundle\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.748172 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-log-httpd\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.748193 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-run-httpd\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.748285 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-sg-core-conf-yaml\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.748321 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttjnt\" (UniqueName: 
\"kubernetes.io/projected/17188cfd-fbac-49d7-86a0-b61de72bd81c-kube-api-access-ttjnt\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.748347 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-config-data\") pod \"17188cfd-fbac-49d7-86a0-b61de72bd81c\" (UID: \"17188cfd-fbac-49d7-86a0-b61de72bd81c\") " Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.751065 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.752036 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.758142 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17188cfd-fbac-49d7-86a0-b61de72bd81c-kube-api-access-ttjnt" (OuterVolumeSpecName: "kube-api-access-ttjnt") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "kube-api-access-ttjnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.760127 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-scripts" (OuterVolumeSpecName: "scripts") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.767563 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8eca-account-create-ppk85"] Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768036 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09add8f-d15e-47ea-83c3-8cd2512ae67a" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768050 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09add8f-d15e-47ea-83c3-8cd2512ae67a" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768063 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="sg-core" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768069 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="sg-core" Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768079 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87f137d-3e0f-423b-af71-2197ae7d9cf2" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768086 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87f137d-3e0f-423b-af71-2197ae7d9cf2" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768116 4959 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="proxy-httpd" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768123 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="proxy-httpd" Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768134 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65663a28-f7e8-430c-92e9-ba8e346b04ba" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768140 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="65663a28-f7e8-430c-92e9-ba8e346b04ba" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768153 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-central-agent" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768159 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-central-agent" Oct 07 13:18:19 crc kubenswrapper[4959]: E1007 13:18:19.768189 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-notification-agent" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768194 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-notification-agent" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768377 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="sg-core" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768390 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-central-agent" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768419 4959 
memory_manager.go:354] "RemoveStaleState removing state" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="ceilometer-notification-agent" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768432 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="65663a28-f7e8-430c-92e9-ba8e346b04ba" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768447 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87f137d-3e0f-423b-af71-2197ae7d9cf2" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768454 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09add8f-d15e-47ea-83c3-8cd2512ae67a" containerName="mariadb-database-create" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.768463 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" containerName="proxy-httpd" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.769141 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.771594 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.775566 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8eca-account-create-ppk85"] Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.803954 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.853532 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjmlj\" (UniqueName: \"kubernetes.io/projected/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c-kube-api-access-vjmlj\") pod \"nova-api-8eca-account-create-ppk85\" (UID: \"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c\") " pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.853756 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.853841 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttjnt\" (UniqueName: \"kubernetes.io/projected/17188cfd-fbac-49d7-86a0-b61de72bd81c-kube-api-access-ttjnt\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.853873 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.853899 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.853911 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17188cfd-fbac-49d7-86a0-b61de72bd81c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.860056 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.895833 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-config-data" (OuterVolumeSpecName: "config-data") pod "17188cfd-fbac-49d7-86a0-b61de72bd81c" (UID: "17188cfd-fbac-49d7-86a0-b61de72bd81c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.955472 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjmlj\" (UniqueName: \"kubernetes.io/projected/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c-kube-api-access-vjmlj\") pod \"nova-api-8eca-account-create-ppk85\" (UID: \"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c\") " pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.955574 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.955585 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17188cfd-fbac-49d7-86a0-b61de72bd81c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.957440 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3a92-account-create-blh5r"] Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.960742 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.963569 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 13:18:19 crc kubenswrapper[4959]: I1007 13:18:19.995465 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjmlj\" (UniqueName: \"kubernetes.io/projected/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c-kube-api-access-vjmlj\") pod \"nova-api-8eca-account-create-ppk85\" (UID: \"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c\") " pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.003923 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3a92-account-create-blh5r"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.057075 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxb9n\" (UniqueName: \"kubernetes.io/projected/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7-kube-api-access-sxb9n\") pod \"nova-cell0-3a92-account-create-blh5r\" (UID: \"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7\") " pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.066776 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17188cfd-fbac-49d7-86a0-b61de72bd81c","Type":"ContainerDied","Data":"8520c5b7d83774c480837419bff1da19e878a1fab6132961602c77a927ec5693"} Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.066844 4959 scope.go:117] "RemoveContainer" containerID="feb43f9bc0ea194654d03a90a38be34787fbfbd0b13ee4d3b87facdbc413fc01" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.066971 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.076103 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"eab0abf5-c944-4a5c-9259-6dc0ea2b115f","Type":"ContainerStarted","Data":"6f70c1cd19665f09aeaa0b950cecf5f1cd28b386220e5271966ce4101ebfec41"} Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.090271 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.095214 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.567596157 podStartE2EDuration="14.095197954s" podCreationTimestamp="2025-10-07 13:18:06 +0000 UTC" firstStartedPulling="2025-10-07 13:18:07.840154701 +0000 UTC m=+1040.000877378" lastFinishedPulling="2025-10-07 13:18:19.367756488 +0000 UTC m=+1051.528479175" observedRunningTime="2025-10-07 13:18:20.08756339 +0000 UTC m=+1052.248286067" watchObservedRunningTime="2025-10-07 13:18:20.095197954 +0000 UTC m=+1052.255920631" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.102428 4959 scope.go:117] "RemoveContainer" containerID="4ac2f6266ebca71342c7f640379bd3461a164eef8ef4b9abadea2be8d0c1f7db" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.123208 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.134281 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.159243 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.161886 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.164599 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxb9n\" (UniqueName: \"kubernetes.io/projected/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7-kube-api-access-sxb9n\") pod \"nova-cell0-3a92-account-create-blh5r\" (UID: \"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7\") " pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.166291 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.167815 4959 scope.go:117] "RemoveContainer" containerID="d59472bd3244d672cb428691586ba5569242112c50c4c389c596c40876b9685d" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.168057 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.175239 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b8a0-account-create-wdlvk"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.176384 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.181501 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.193708 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.204964 4959 scope.go:117] "RemoveContainer" containerID="701fbbcbc18dbd963c5084b28171f71652f596e741de83b02f9a863893d3b4db" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.205871 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxb9n\" (UniqueName: \"kubernetes.io/projected/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7-kube-api-access-sxb9n\") pod \"nova-cell0-3a92-account-create-blh5r\" (UID: \"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7\") " pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.207593 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b8a0-account-create-wdlvk"] Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.266835 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-scripts\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.267227 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4r5\" (UniqueName: \"kubernetes.io/projected/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde-kube-api-access-vc4r5\") pod \"nova-cell1-b8a0-account-create-wdlvk\" (UID: \"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde\") " pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 
13:18:20.267298 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.267342 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-log-httpd\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.267364 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.267392 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-run-httpd\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.267500 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgz2q\" (UniqueName: \"kubernetes.io/projected/a157df71-5511-4702-9fae-b4e2c5e69d52-kube-api-access-sgz2q\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.267529 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-config-data\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.288173 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.368835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc4r5\" (UniqueName: \"kubernetes.io/projected/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde-kube-api-access-vc4r5\") pod \"nova-cell1-b8a0-account-create-wdlvk\" (UID: \"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde\") " pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.368946 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.368973 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-log-httpd\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.369939 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-log-httpd\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.369985 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.370019 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-run-httpd\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.370279 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-run-httpd\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.370371 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz2q\" (UniqueName: \"kubernetes.io/projected/a157df71-5511-4702-9fae-b4e2c5e69d52-kube-api-access-sgz2q\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.370406 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-config-data\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.370454 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-scripts\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc 
kubenswrapper[4959]: I1007 13:18:20.373449 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.376070 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-config-data\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.376504 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.387170 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgz2q\" (UniqueName: \"kubernetes.io/projected/a157df71-5511-4702-9fae-b4e2c5e69d52-kube-api-access-sgz2q\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.388042 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-scripts\") pod \"ceilometer-0\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") " pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.390225 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc4r5\" (UniqueName: \"kubernetes.io/projected/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde-kube-api-access-vc4r5\") pod 
\"nova-cell1-b8a0-account-create-wdlvk\" (UID: \"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde\") " pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.484587 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.514254 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.568017 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c99fbb6b6-2j7rt" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.699923 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8eca-account-create-ppk85"] Oct 07 13:18:20 crc kubenswrapper[4959]: W1007 13:18:20.722584 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda53ee4c4_47e3_4213_bbe8_6d2f92f28a4c.slice/crio-1799385b83edac6017d7d2421b3b30147a5669781e2144986fe7113ba2433b36 WatchSource:0}: Error finding container 1799385b83edac6017d7d2421b3b30147a5669781e2144986fe7113ba2433b36: Status 404 returned error can't find the container with id 1799385b83edac6017d7d2421b3b30147a5669781e2144986fe7113ba2433b36 Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.829540 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17188cfd-fbac-49d7-86a0-b61de72bd81c" path="/var/lib/kubelet/pods/17188cfd-fbac-49d7-86a0-b61de72bd81c/volumes" Oct 07 13:18:20 crc kubenswrapper[4959]: I1007 13:18:20.869313 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-3a92-account-create-blh5r"] Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.013868 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:21 crc kubenswrapper[4959]: W1007 13:18:21.048024 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda157df71_5511_4702_9fae_b4e2c5e69d52.slice/crio-9662af75f62ddb7853f442ccb7058c97b4cba704854d7d46453d454e4145ef45 WatchSource:0}: Error finding container 9662af75f62ddb7853f442ccb7058c97b4cba704854d7d46453d454e4145ef45: Status 404 returned error can't find the container with id 9662af75f62ddb7853f442ccb7058c97b4cba704854d7d46453d454e4145ef45 Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.087008 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerStarted","Data":"9662af75f62ddb7853f442ccb7058c97b4cba704854d7d46453d454e4145ef45"} Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.092179 4959 generic.go:334] "Generic (PLEG): container finished" podID="a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c" containerID="19bc1701a6c81a9ee4bf8dce3abf860b68373c0e4a1cecbccb882db2a9b6a3f9" exitCode=0 Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.092245 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8eca-account-create-ppk85" event={"ID":"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c","Type":"ContainerDied","Data":"19bc1701a6c81a9ee4bf8dce3abf860b68373c0e4a1cecbccb882db2a9b6a3f9"} Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.092272 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8eca-account-create-ppk85" event={"ID":"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c","Type":"ContainerStarted","Data":"1799385b83edac6017d7d2421b3b30147a5669781e2144986fe7113ba2433b36"} Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 
13:18:21.096825 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3a92-account-create-blh5r" event={"ID":"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7","Type":"ContainerStarted","Data":"c24041a679ee6d16452c6a4693b6f61d688659e2f4f8aab5fda71aba8ce9257e"} Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.109307 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b8a0-account-create-wdlvk"] Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.880484 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.895298 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-combined-ca-bundle\") pod \"e4e49e10-6ce8-4bde-b15c-141fe2479574\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.895596 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h2xg\" (UniqueName: \"kubernetes.io/projected/e4e49e10-6ce8-4bde-b15c-141fe2479574-kube-api-access-9h2xg\") pod \"e4e49e10-6ce8-4bde-b15c-141fe2479574\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.895641 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-config\") pod \"e4e49e10-6ce8-4bde-b15c-141fe2479574\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.895678 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-ovndb-tls-certs\") pod \"e4e49e10-6ce8-4bde-b15c-141fe2479574\" 
(UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.895733 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-httpd-config\") pod \"e4e49e10-6ce8-4bde-b15c-141fe2479574\" (UID: \"e4e49e10-6ce8-4bde-b15c-141fe2479574\") " Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.902808 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e49e10-6ce8-4bde-b15c-141fe2479574-kube-api-access-9h2xg" (OuterVolumeSpecName: "kube-api-access-9h2xg") pod "e4e49e10-6ce8-4bde-b15c-141fe2479574" (UID: "e4e49e10-6ce8-4bde-b15c-141fe2479574"). InnerVolumeSpecName "kube-api-access-9h2xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.921942 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e4e49e10-6ce8-4bde-b15c-141fe2479574" (UID: "e4e49e10-6ce8-4bde-b15c-141fe2479574"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.985777 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e49e10-6ce8-4bde-b15c-141fe2479574" (UID: "e4e49e10-6ce8-4bde-b15c-141fe2479574"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.991177 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-config" (OuterVolumeSpecName: "config") pod "e4e49e10-6ce8-4bde-b15c-141fe2479574" (UID: "e4e49e10-6ce8-4bde-b15c-141fe2479574"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.997211 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.997248 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h2xg\" (UniqueName: \"kubernetes.io/projected/e4e49e10-6ce8-4bde-b15c-141fe2479574-kube-api-access-9h2xg\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.997260 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:21 crc kubenswrapper[4959]: I1007 13:18:21.997271 4959 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.018827 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e4e49e10-6ce8-4bde-b15c-141fe2479574" (UID: "e4e49e10-6ce8-4bde-b15c-141fe2479574"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.099054 4959 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e49e10-6ce8-4bde-b15c-141fe2479574-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.109936 4959 generic.go:334] "Generic (PLEG): container finished" podID="4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7" containerID="b045ffcddbcdb73ecdef699772ca7c33fc1967b028c73276ea054786e510923c" exitCode=0 Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.110007 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3a92-account-create-blh5r" event={"ID":"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7","Type":"ContainerDied","Data":"b045ffcddbcdb73ecdef699772ca7c33fc1967b028c73276ea054786e510923c"} Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.112286 4959 generic.go:334] "Generic (PLEG): container finished" podID="1abd2ce0-24fb-4a3e-abce-ab6e4a693cde" containerID="37ca17308a328c74235682edcbd7f343a853b9cde8f257b6b5232ec1c02d9adb" exitCode=0 Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.112389 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" event={"ID":"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde","Type":"ContainerDied","Data":"37ca17308a328c74235682edcbd7f343a853b9cde8f257b6b5232ec1c02d9adb"} Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.112436 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" event={"ID":"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde","Type":"ContainerStarted","Data":"6747efd81a24b68b133df93ac967159a148d608f668819781dc4b3406beea5ce"} Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.114978 4959 generic.go:334] "Generic (PLEG): container finished" podID="e4e49e10-6ce8-4bde-b15c-141fe2479574" 
containerID="c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc" exitCode=0 Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.115051 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94d9575bd-dbgt6" event={"ID":"e4e49e10-6ce8-4bde-b15c-141fe2479574","Type":"ContainerDied","Data":"c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc"} Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.115088 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94d9575bd-dbgt6" event={"ID":"e4e49e10-6ce8-4bde-b15c-141fe2479574","Type":"ContainerDied","Data":"94a32e3fc7ddebda4df383dbb4c8cb3984d8ef994f29c660e20cd45ce17c6058"} Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.115109 4959 scope.go:117] "RemoveContainer" containerID="e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.115224 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-94d9575bd-dbgt6" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.118890 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerStarted","Data":"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"} Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.177221 4959 scope.go:117] "RemoveContainer" containerID="c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.194577 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-94d9575bd-dbgt6"] Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.202529 4959 scope.go:117] "RemoveContainer" containerID="e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018" Oct 07 13:18:22 crc kubenswrapper[4959]: E1007 13:18:22.203003 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018\": container with ID starting with e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018 not found: ID does not exist" containerID="e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.203050 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018"} err="failed to get container status \"e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018\": rpc error: code = NotFound desc = could not find container \"e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018\": container with ID starting with e4c95759aa0a69569e467b140ae00aee2bbc78c98280995de232a6ea82101018 not found: ID does not exist" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 
13:18:22.203079 4959 scope.go:117] "RemoveContainer" containerID="c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc" Oct 07 13:18:22 crc kubenswrapper[4959]: E1007 13:18:22.203322 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc\": container with ID starting with c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc not found: ID does not exist" containerID="c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.203375 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc"} err="failed to get container status \"c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc\": rpc error: code = NotFound desc = could not find container \"c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc\": container with ID starting with c15797384fa3ae35d43f6a6ebd27ad925b12a2e9c1b83e030b78911a2f44f8cc not found: ID does not exist" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.209231 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-94d9575bd-dbgt6"] Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.609568 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.614321 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.810178 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjmlj\" (UniqueName: \"kubernetes.io/projected/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c-kube-api-access-vjmlj\") pod \"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c\" (UID: \"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c\") " Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.817454 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c-kube-api-access-vjmlj" (OuterVolumeSpecName: "kube-api-access-vjmlj") pod "a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c" (UID: "a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c"). InnerVolumeSpecName "kube-api-access-vjmlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.820439 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" path="/var/lib/kubelet/pods/e4e49e10-6ce8-4bde-b15c-141fe2479574/volumes" Oct 07 13:18:22 crc kubenswrapper[4959]: I1007 13:18:22.913280 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjmlj\" (UniqueName: \"kubernetes.io/projected/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c-kube-api-access-vjmlj\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.136116 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerStarted","Data":"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"} Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.137838 4959 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-8eca-account-create-ppk85" event={"ID":"a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c","Type":"ContainerDied","Data":"1799385b83edac6017d7d2421b3b30147a5669781e2144986fe7113ba2433b36"} Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.137869 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1799385b83edac6017d7d2421b3b30147a5669781e2144986fe7113ba2433b36" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.137930 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8eca-account-create-ppk85" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.596197 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.604107 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.725592 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxb9n\" (UniqueName: \"kubernetes.io/projected/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7-kube-api-access-sxb9n\") pod \"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7\" (UID: \"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7\") " Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.726018 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc4r5\" (UniqueName: \"kubernetes.io/projected/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde-kube-api-access-vc4r5\") pod \"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde\" (UID: \"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde\") " Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.732117 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7-kube-api-access-sxb9n" (OuterVolumeSpecName: 
"kube-api-access-sxb9n") pod "4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7" (UID: "4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7"). InnerVolumeSpecName "kube-api-access-sxb9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.749337 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde-kube-api-access-vc4r5" (OuterVolumeSpecName: "kube-api-access-vc4r5") pod "1abd2ce0-24fb-4a3e-abce-ab6e4a693cde" (UID: "1abd2ce0-24fb-4a3e-abce-ab6e4a693cde"). InnerVolumeSpecName "kube-api-access-vc4r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.827933 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxb9n\" (UniqueName: \"kubernetes.io/projected/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7-kube-api-access-sxb9n\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:23 crc kubenswrapper[4959]: I1007 13:18:23.827963 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc4r5\" (UniqueName: \"kubernetes.io/projected/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde-kube-api-access-vc4r5\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.152703 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.152708 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b8a0-account-create-wdlvk" event={"ID":"1abd2ce0-24fb-4a3e-abce-ab6e4a693cde","Type":"ContainerDied","Data":"6747efd81a24b68b133df93ac967159a148d608f668819781dc4b3406beea5ce"} Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.154993 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6747efd81a24b68b133df93ac967159a148d608f668819781dc4b3406beea5ce" Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.155155 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerStarted","Data":"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"} Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.156230 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3a92-account-create-blh5r" event={"ID":"4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7","Type":"ContainerDied","Data":"c24041a679ee6d16452c6a4693b6f61d688659e2f4f8aab5fda71aba8ce9257e"} Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.156279 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c24041a679ee6d16452c6a4693b6f61d688659e2f4f8aab5fda71aba8ce9257e" Oct 07 13:18:24 crc kubenswrapper[4959]: I1007 13:18:24.156592 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3a92-account-create-blh5r" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.181512 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xphbd"] Oct 07 13:18:25 crc kubenswrapper[4959]: E1007 13:18:25.182399 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-httpd" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182416 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-httpd" Oct 07 13:18:25 crc kubenswrapper[4959]: E1007 13:18:25.182428 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182435 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: E1007 13:18:25.182448 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abd2ce0-24fb-4a3e-abce-ab6e4a693cde" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182455 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abd2ce0-24fb-4a3e-abce-ab6e4a693cde" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: E1007 13:18:25.182469 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182477 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: E1007 13:18:25.182500 4959 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-api" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182507 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-api" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182676 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-httpd" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182688 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182702 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e49e10-6ce8-4bde-b15c-141fe2479574" containerName="neutron-api" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182717 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abd2ce0-24fb-4a3e-abce-ab6e4a693cde" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.182726 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c" containerName="mariadb-account-create" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.183284 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xphbd" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.186072 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-z6zp7" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.186453 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.186665 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.193086 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xphbd"] Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.356138 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.356191 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-config-data\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd" Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.356244 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxx4\" (UniqueName: \"kubernetes.io/projected/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-kube-api-access-sfxx4\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " 
pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.356313 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-scripts\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.458071 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.458124 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-config-data\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.458177 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxx4\" (UniqueName: \"kubernetes.io/projected/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-kube-api-access-sfxx4\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.458235 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-scripts\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.465508 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-config-data\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.466309 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.483167 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-scripts\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.486313 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxx4\" (UniqueName: \"kubernetes.io/projected/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-kube-api-access-sfxx4\") pod \"nova-cell0-conductor-db-sync-xphbd\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.522224 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xphbd"
Oct 07 13:18:25 crc kubenswrapper[4959]: I1007 13:18:25.994681 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xphbd"]
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.065988 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c99fbb6b6-2j7rt"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170512 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-combined-ca-bundle\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170609 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-tls-certs\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170654 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-config-data\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170833 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468d7f11-8929-410e-a59d-1f78cc33a279-logs\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170879 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-scripts\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170925 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbbk6\" (UniqueName: \"kubernetes.io/projected/468d7f11-8929-410e-a59d-1f78cc33a279-kube-api-access-lbbk6\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.170952 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-secret-key\") pod \"468d7f11-8929-410e-a59d-1f78cc33a279\" (UID: \"468d7f11-8929-410e-a59d-1f78cc33a279\") "
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.171603 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/468d7f11-8929-410e-a59d-1f78cc33a279-logs" (OuterVolumeSpecName: "logs") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.172717 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xphbd" event={"ID":"cdeea2ce-cd7b-4608-8dc8-bab322fc76db","Type":"ContainerStarted","Data":"94190892a115a6250d55024970b8ac547b2b17efe390995950a934fe4f3268c4"}
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.174743 4959 generic.go:334] "Generic (PLEG): container finished" podID="468d7f11-8929-410e-a59d-1f78cc33a279" containerID="9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9" exitCode=137
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.174788 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c99fbb6b6-2j7rt" event={"ID":"468d7f11-8929-410e-a59d-1f78cc33a279","Type":"ContainerDied","Data":"9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9"}
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.174817 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c99fbb6b6-2j7rt" event={"ID":"468d7f11-8929-410e-a59d-1f78cc33a279","Type":"ContainerDied","Data":"67ff2c5f24ba6b1134361b471278457ba91b1bdff023542e5b362f748119eea9"}
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.174838 4959 scope.go:117] "RemoveContainer" containerID="52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.174858 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c99fbb6b6-2j7rt"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.175786 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468d7f11-8929-410e-a59d-1f78cc33a279-kube-api-access-lbbk6" (OuterVolumeSpecName: "kube-api-access-lbbk6") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "kube-api-access-lbbk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.176328 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.195242 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-config-data" (OuterVolumeSpecName: "config-data") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.198119 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.210123 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-scripts" (OuterVolumeSpecName: "scripts") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.223423 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "468d7f11-8929-410e-a59d-1f78cc33a279" (UID: "468d7f11-8929-410e-a59d-1f78cc33a279"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272374 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468d7f11-8929-410e-a59d-1f78cc33a279-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272408 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272418 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbbk6\" (UniqueName: \"kubernetes.io/projected/468d7f11-8929-410e-a59d-1f78cc33a279-kube-api-access-lbbk6\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272428 4959 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272438 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272446 4959 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d7f11-8929-410e-a59d-1f78cc33a279-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.272456 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468d7f11-8929-410e-a59d-1f78cc33a279-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.364866 4959 scope.go:117] "RemoveContainer" containerID="9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.386239 4959 scope.go:117] "RemoveContainer" containerID="52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617"
Oct 07 13:18:26 crc kubenswrapper[4959]: E1007 13:18:26.387011 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617\": container with ID starting with 52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617 not found: ID does not exist" containerID="52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.387055 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617"} err="failed to get container status \"52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617\": rpc error: code = NotFound desc = could not find container \"52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617\": container with ID starting with 52b15c925bec5f466911bc5bda1d856ad826147c584361942992b043df532617 not found: ID does not exist"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.387082 4959 scope.go:117] "RemoveContainer" containerID="9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9"
Oct 07 13:18:26 crc kubenswrapper[4959]: E1007 13:18:26.387307 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9\": container with ID starting with 9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9 not found: ID does not exist" containerID="9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.387336 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9"} err="failed to get container status \"9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9\": rpc error: code = NotFound desc = could not find container \"9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9\": container with ID starting with 9e9d3b4802cb4c5cde4cf6f6946b6a0b1c188be662328c417813c9f812ffc9d9 not found: ID does not exist"
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.608905 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c99fbb6b6-2j7rt"]
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.616982 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c99fbb6b6-2j7rt"]
Oct 07 13:18:26 crc kubenswrapper[4959]: I1007 13:18:26.825231 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" path="/var/lib/kubelet/pods/468d7f11-8929-410e-a59d-1f78cc33a279/volumes"
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.187859 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerStarted","Data":"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"}
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.188138 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.187986 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-central-agent" containerID="cri-o://2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f" gracePeriod=30
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.188211 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="proxy-httpd" containerID="cri-o://d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99" gracePeriod=30
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.188281 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-notification-agent" containerID="cri-o://da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206" gracePeriod=30
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.188288 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="sg-core" containerID="cri-o://5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95" gracePeriod=30
Oct 07 13:18:27 crc kubenswrapper[4959]: I1007 13:18:27.207821 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.84925631 podStartE2EDuration="7.2078017s" podCreationTimestamp="2025-10-07 13:18:20 +0000 UTC" firstStartedPulling="2025-10-07 13:18:21.050355363 +0000 UTC m=+1053.211078040" lastFinishedPulling="2025-10-07 13:18:26.408900753 +0000 UTC m=+1058.569623430" observedRunningTime="2025-10-07 13:18:27.206286856 +0000 UTC m=+1059.367009543" watchObservedRunningTime="2025-10-07 13:18:27.2078017 +0000 UTC m=+1059.368524397"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.100984 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113104 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-sg-core-conf-yaml\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113265 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-scripts\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113327 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgz2q\" (UniqueName: \"kubernetes.io/projected/a157df71-5511-4702-9fae-b4e2c5e69d52-kube-api-access-sgz2q\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113366 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-log-httpd\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113463 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-combined-ca-bundle\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113514 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-config-data\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.113530 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-run-httpd\") pod \"a157df71-5511-4702-9fae-b4e2c5e69d52\" (UID: \"a157df71-5511-4702-9fae-b4e2c5e69d52\") "
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.114663 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.118548 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.121001 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a157df71-5511-4702-9fae-b4e2c5e69d52-kube-api-access-sgz2q" (OuterVolumeSpecName: "kube-api-access-sgz2q") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "kube-api-access-sgz2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.126935 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-scripts" (OuterVolumeSpecName: "scripts") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.146529 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199202 4959 generic.go:334] "Generic (PLEG): container finished" podID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99" exitCode=0
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199241 4959 generic.go:334] "Generic (PLEG): container finished" podID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95" exitCode=2
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199263 4959 generic.go:334] "Generic (PLEG): container finished" podID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206" exitCode=0
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199276 4959 generic.go:334] "Generic (PLEG): container finished" podID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f" exitCode=0
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199298 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerDied","Data":"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"}
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199327 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerDied","Data":"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"}
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199341 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerDied","Data":"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"}
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199352 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerDied","Data":"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"}
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199362 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a157df71-5511-4702-9fae-b4e2c5e69d52","Type":"ContainerDied","Data":"9662af75f62ddb7853f442ccb7058c97b4cba704854d7d46453d454e4145ef45"}
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199379 4959 scope.go:117] "RemoveContainer" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.199517 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.204267 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215514 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215539 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215548 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215558 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215567 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgz2q\" (UniqueName: \"kubernetes.io/projected/a157df71-5511-4702-9fae-b4e2c5e69d52-kube-api-access-sgz2q\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215642 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a157df71-5511-4702-9fae-b4e2c5e69d52-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.215799 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-config-data" (OuterVolumeSpecName: "config-data") pod "a157df71-5511-4702-9fae-b4e2c5e69d52" (UID: "a157df71-5511-4702-9fae-b4e2c5e69d52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.218140 4959 scope.go:117] "RemoveContainer" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.235486 4959 scope.go:117] "RemoveContainer" containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.252600 4959 scope.go:117] "RemoveContainer" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.270992 4959 scope.go:117] "RemoveContainer" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"
Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.271470 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": container with ID starting with d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99 not found: ID does not exist" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.271529 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"} err="failed to get container status \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": rpc error: code = NotFound desc = could not find container \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": container with ID starting with d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99 not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.271565 4959 scope.go:117] "RemoveContainer" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"
Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.272061 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": container with ID starting with 5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95 not found: ID does not exist" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.272104 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"} err="failed to get container status \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": rpc error: code = NotFound desc = could not find container \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": container with ID starting with 5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95 not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.272134 4959 scope.go:117] "RemoveContainer" containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"
Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.272451 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": container with ID starting with da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206 not found: ID does not exist" containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.272484 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"} err="failed to get container status \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": rpc error: code = NotFound desc = could not find container \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": container with ID starting with da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206 not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.272506 4959 scope.go:117] "RemoveContainer" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"
Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.272993 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": container with ID starting with 2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f not found: ID does not exist" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.273028 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"} err="failed to get container status \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": rpc error: code = NotFound desc = could not find container \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": container with ID starting with 2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.273098 4959 scope.go:117] "RemoveContainer" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.273518 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"} err="failed to get container status \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": rpc error: code = NotFound desc = could not find container \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": container with ID starting with d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99 not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.273552 4959 scope.go:117] "RemoveContainer" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.273922 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"} err="failed to get container status \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": rpc error: code = NotFound desc = could not find container \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": container with ID starting with 5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95 not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.273952 4959 scope.go:117] "RemoveContainer" containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.274313 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"} err="failed to get container status \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": rpc error: code = NotFound desc = could not find container \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": container with ID starting with da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206 not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.274352 4959 scope.go:117] "RemoveContainer" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.274761 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"} err="failed to get container status \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": rpc error: code = NotFound desc = could not find container \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": container with ID starting with 2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f not found: ID does not exist"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.274788 4959 scope.go:117] "RemoveContainer" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275008 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"} err="failed to get container status \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": rpc error: code = NotFound desc = could not find container \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": container with ID starting with d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99 not found: ID does not
exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275026 4959 scope.go:117] "RemoveContainer" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275223 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"} err="failed to get container status \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": rpc error: code = NotFound desc = could not find container \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": container with ID starting with 5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95 not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275241 4959 scope.go:117] "RemoveContainer" containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275412 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"} err="failed to get container status \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": rpc error: code = NotFound desc = could not find container \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": container with ID starting with da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206 not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275438 4959 scope.go:117] "RemoveContainer" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275796 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"} err="failed to get container status 
\"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": rpc error: code = NotFound desc = could not find container \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": container with ID starting with 2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.275820 4959 scope.go:117] "RemoveContainer" containerID="d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276047 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99"} err="failed to get container status \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": rpc error: code = NotFound desc = could not find container \"d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99\": container with ID starting with d45ea568e7fe523bf6d93daf78c5c304bedacf14a305844dfd71b114b8caef99 not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276067 4959 scope.go:117] "RemoveContainer" containerID="5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276309 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95"} err="failed to get container status \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": rpc error: code = NotFound desc = could not find container \"5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95\": container with ID starting with 5ea60dd6b92bb0875019d645f59dbdec87918b552e1babeac1fb639aa2e94a95 not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276327 4959 scope.go:117] "RemoveContainer" 
containerID="da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276641 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206"} err="failed to get container status \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": rpc error: code = NotFound desc = could not find container \"da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206\": container with ID starting with da88a138ac14f7b98a94e69a029c0e9b1171bf51992ed903cc0e07fd0e882206 not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276662 4959 scope.go:117] "RemoveContainer" containerID="2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.276938 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f"} err="failed to get container status \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": rpc error: code = NotFound desc = could not find container \"2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f\": container with ID starting with 2cdee574d4dcee6be7f2991b8c25de50dfd43602a56b6bcedc5b7265793e905f not found: ID does not exist" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.317641 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a157df71-5511-4702-9fae-b4e2c5e69d52-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.540751 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.558908 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] 
Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570029 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.570378 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon-log" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570390 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon-log" Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.570406 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-central-agent" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570412 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-central-agent" Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.570425 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="proxy-httpd" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570430 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="proxy-httpd" Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.570442 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-notification-agent" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570448 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-notification-agent" Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.570467 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" Oct 07 13:18:28 crc 
kubenswrapper[4959]: I1007 13:18:28.570473 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" Oct 07 13:18:28 crc kubenswrapper[4959]: E1007 13:18:28.570490 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="sg-core" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570496 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="sg-core" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570655 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="sg-core" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570668 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon-log" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570676 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-central-agent" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570688 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="ceilometer-notification-agent" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570705 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" containerName="proxy-httpd" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.570714 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="468d7f11-8929-410e-a59d-1f78cc33a279" containerName="horizon" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.572397 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.579330 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.584137 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.592874 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632166 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-config-data\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632320 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632350 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-log-httpd\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632436 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxrh8\" (UniqueName: \"kubernetes.io/projected/d42fa5e4-9905-42d9-b7cc-e33c6198f012-kube-api-access-dxrh8\") pod \"ceilometer-0\" (UID: 
\"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632458 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632476 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-scripts\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.632489 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-run-httpd\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734029 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxrh8\" (UniqueName: \"kubernetes.io/projected/d42fa5e4-9905-42d9-b7cc-e33c6198f012-kube-api-access-dxrh8\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734413 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734479 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-scripts\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734496 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-run-httpd\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734818 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-config-data\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734864 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.734999 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-log-httpd\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.735853 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-run-httpd\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " 
pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.735999 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-log-httpd\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.738887 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.738958 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-scripts\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.741386 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.751302 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-config-data\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.752460 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxrh8\" (UniqueName: 
\"kubernetes.io/projected/d42fa5e4-9905-42d9-b7cc-e33c6198f012-kube-api-access-dxrh8\") pod \"ceilometer-0\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " pod="openstack/ceilometer-0" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.820997 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a157df71-5511-4702-9fae-b4e2c5e69d52" path="/var/lib/kubelet/pods/a157df71-5511-4702-9fae-b4e2c5e69d52/volumes" Oct 07 13:18:28 crc kubenswrapper[4959]: I1007 13:18:28.887689 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:18:33 crc kubenswrapper[4959]: I1007 13:18:33.500914 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:18:34 crc kubenswrapper[4959]: I1007 13:18:34.255555 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerStarted","Data":"1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1"} Oct 07 13:18:34 crc kubenswrapper[4959]: I1007 13:18:34.255994 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerStarted","Data":"82ae37bb2056f863e9b1787e766ee0f5653c13078d96e8fe038fe5a04937f60c"} Oct 07 13:18:34 crc kubenswrapper[4959]: I1007 13:18:34.265361 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xphbd" event={"ID":"cdeea2ce-cd7b-4608-8dc8-bab322fc76db","Type":"ContainerStarted","Data":"01acbc9e010509d954b70d866b7db920dd22a35b9badcd858e2f3b93f378d348"} Oct 07 13:18:34 crc kubenswrapper[4959]: I1007 13:18:34.286065 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-xphbd" podStartSLOduration=2.152723738 podStartE2EDuration="9.286048461s" podCreationTimestamp="2025-10-07 13:18:25 +0000 UTC" 
firstStartedPulling="2025-10-07 13:18:25.997034991 +0000 UTC m=+1058.157757668" lastFinishedPulling="2025-10-07 13:18:33.130359714 +0000 UTC m=+1065.291082391" observedRunningTime="2025-10-07 13:18:34.27883726 +0000 UTC m=+1066.439559947" watchObservedRunningTime="2025-10-07 13:18:34.286048461 +0000 UTC m=+1066.446771138" Oct 07 13:18:36 crc kubenswrapper[4959]: I1007 13:18:36.283089 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerStarted","Data":"b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004"} Oct 07 13:18:37 crc kubenswrapper[4959]: I1007 13:18:37.293289 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerStarted","Data":"8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f"} Oct 07 13:18:38 crc kubenswrapper[4959]: I1007 13:18:38.842411 4959 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod11336708-94ba-4b0c-b4c4-1f3bb24f44b0"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod11336708-94ba-4b0c-b4c4-1f3bb24f44b0] : Timed out while waiting for systemd to remove kubepods-besteffort-pod11336708_94ba_4b0c_b4c4_1f3bb24f44b0.slice" Oct 07 13:18:41 crc kubenswrapper[4959]: I1007 13:18:41.331181 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerStarted","Data":"9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8"} Oct 07 13:18:42 crc kubenswrapper[4959]: I1007 13:18:42.339272 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 13:18:42 crc kubenswrapper[4959]: I1007 13:18:42.367274 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=9.162210453 podStartE2EDuration="14.36725849s" podCreationTimestamp="2025-10-07 13:18:28 +0000 UTC" firstStartedPulling="2025-10-07 13:18:33.518611085 +0000 UTC m=+1065.679333762" lastFinishedPulling="2025-10-07 13:18:38.723659122 +0000 UTC m=+1070.884381799" observedRunningTime="2025-10-07 13:18:42.361297016 +0000 UTC m=+1074.522019693" watchObservedRunningTime="2025-10-07 13:18:42.36725849 +0000 UTC m=+1074.527981167" Oct 07 13:18:45 crc kubenswrapper[4959]: I1007 13:18:45.362553 4959 generic.go:334] "Generic (PLEG): container finished" podID="cdeea2ce-cd7b-4608-8dc8-bab322fc76db" containerID="01acbc9e010509d954b70d866b7db920dd22a35b9badcd858e2f3b93f378d348" exitCode=0 Oct 07 13:18:45 crc kubenswrapper[4959]: I1007 13:18:45.362640 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xphbd" event={"ID":"cdeea2ce-cd7b-4608-8dc8-bab322fc76db","Type":"ContainerDied","Data":"01acbc9e010509d954b70d866b7db920dd22a35b9badcd858e2f3b93f378d348"} Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.739747 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xphbd" Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.925093 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfxx4\" (UniqueName: \"kubernetes.io/projected/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-kube-api-access-sfxx4\") pod \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.925171 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-config-data\") pod \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.925231 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-combined-ca-bundle\") pod \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.925302 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-scripts\") pod \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\" (UID: \"cdeea2ce-cd7b-4608-8dc8-bab322fc76db\") " Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.929997 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-scripts" (OuterVolumeSpecName: "scripts") pod "cdeea2ce-cd7b-4608-8dc8-bab322fc76db" (UID: "cdeea2ce-cd7b-4608-8dc8-bab322fc76db"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.930076 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-kube-api-access-sfxx4" (OuterVolumeSpecName: "kube-api-access-sfxx4") pod "cdeea2ce-cd7b-4608-8dc8-bab322fc76db" (UID: "cdeea2ce-cd7b-4608-8dc8-bab322fc76db"). InnerVolumeSpecName "kube-api-access-sfxx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.949584 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdeea2ce-cd7b-4608-8dc8-bab322fc76db" (UID: "cdeea2ce-cd7b-4608-8dc8-bab322fc76db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:46 crc kubenswrapper[4959]: I1007 13:18:46.952923 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-config-data" (OuterVolumeSpecName: "config-data") pod "cdeea2ce-cd7b-4608-8dc8-bab322fc76db" (UID: "cdeea2ce-cd7b-4608-8dc8-bab322fc76db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.027242 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfxx4\" (UniqueName: \"kubernetes.io/projected/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-kube-api-access-sfxx4\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.027270 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.027280 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.027289 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdeea2ce-cd7b-4608-8dc8-bab322fc76db-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.381693 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xphbd" event={"ID":"cdeea2ce-cd7b-4608-8dc8-bab322fc76db","Type":"ContainerDied","Data":"94190892a115a6250d55024970b8ac547b2b17efe390995950a934fe4f3268c4"} Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.381742 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94190892a115a6250d55024970b8ac547b2b17efe390995950a934fe4f3268c4" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.381785 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xphbd" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.466921 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:18:47 crc kubenswrapper[4959]: E1007 13:18:47.467291 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdeea2ce-cd7b-4608-8dc8-bab322fc76db" containerName="nova-cell0-conductor-db-sync" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.467307 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdeea2ce-cd7b-4608-8dc8-bab322fc76db" containerName="nova-cell0-conductor-db-sync" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.467489 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdeea2ce-cd7b-4608-8dc8-bab322fc76db" containerName="nova-cell0-conductor-db-sync" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.468421 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.470335 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-z6zp7" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.472975 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.475528 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.639022 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56p2\" (UniqueName: \"kubernetes.io/projected/bcabf204-0890-4bfc-9a94-b921b3011603-kube-api-access-t56p2\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc 
kubenswrapper[4959]: I1007 13:18:47.639256 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcabf204-0890-4bfc-9a94-b921b3011603-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.639383 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcabf204-0890-4bfc-9a94-b921b3011603-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.740850 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcabf204-0890-4bfc-9a94-b921b3011603-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.740930 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcabf204-0890-4bfc-9a94-b921b3011603-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.740991 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56p2\" (UniqueName: \"kubernetes.io/projected/bcabf204-0890-4bfc-9a94-b921b3011603-kube-api-access-t56p2\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.745887 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcabf204-0890-4bfc-9a94-b921b3011603-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.749408 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcabf204-0890-4bfc-9a94-b921b3011603-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.759455 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56p2\" (UniqueName: \"kubernetes.io/projected/bcabf204-0890-4bfc-9a94-b921b3011603-kube-api-access-t56p2\") pod \"nova-cell0-conductor-0\" (UID: \"bcabf204-0890-4bfc-9a94-b921b3011603\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:47 crc kubenswrapper[4959]: I1007 13:18:47.787725 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:48 crc kubenswrapper[4959]: I1007 13:18:48.202054 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:18:48 crc kubenswrapper[4959]: I1007 13:18:48.391200 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcabf204-0890-4bfc-9a94-b921b3011603","Type":"ContainerStarted","Data":"1d95392657a6a98040c773ae090daa6ab84c9018416eeb0f0efc71f289331507"} Oct 07 13:18:49 crc kubenswrapper[4959]: I1007 13:18:49.401534 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcabf204-0890-4bfc-9a94-b921b3011603","Type":"ContainerStarted","Data":"4eab5cd88bf4cfbc3656877f1259afb9d6011b5785bb51b96aaf8e6edbce9081"} Oct 07 13:18:49 crc kubenswrapper[4959]: I1007 13:18:49.403121 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:49 crc kubenswrapper[4959]: I1007 13:18:49.424712 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.424695323 podStartE2EDuration="2.424695323s" podCreationTimestamp="2025-10-07 13:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:18:49.419508361 +0000 UTC m=+1081.580231038" watchObservedRunningTime="2025-10-07 13:18:49.424695323 +0000 UTC m=+1081.585418000" Oct 07 13:18:57 crc kubenswrapper[4959]: I1007 13:18:57.812074 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.236297 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-l98vg"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.237648 4959 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.239594 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.239741 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.250328 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-l98vg"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.318034 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qtn2\" (UniqueName: \"kubernetes.io/projected/41dde621-534c-4f39-ab29-baa7401101a8-kube-api-access-2qtn2\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.318154 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.318232 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-config-data\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.318350 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-scripts\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.365648 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.366905 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.371489 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.379846 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420142 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-scripts\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420217 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420257 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/768644fb-d5ea-43ba-8277-7864670945ec-kube-api-access-qnkjf\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420276 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qtn2\" (UniqueName: \"kubernetes.io/projected/41dde621-534c-4f39-ab29-baa7401101a8-kube-api-access-2qtn2\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420330 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420347 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.420386 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-config-data\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.430928 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-scripts\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: 
\"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.433364 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-config-data\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.439659 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.444026 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qtn2\" (UniqueName: \"kubernetes.io/projected/41dde621-534c-4f39-ab29-baa7401101a8-kube-api-access-2qtn2\") pod \"nova-cell0-cell-mapping-l98vg\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.468522 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.469884 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.472117 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.487459 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.522722 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/768644fb-d5ea-43ba-8277-7864670945ec-kube-api-access-qnkjf\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.522779 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad11b8e3-2922-42d9-9368-f08e826ecd80-logs\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.522855 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.522968 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.523003 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps9h\" (UniqueName: \"kubernetes.io/projected/ad11b8e3-2922-42d9-9368-f08e826ecd80-kube-api-access-zps9h\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.523029 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-config-data\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.523054 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.525656 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.527028 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.531096 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.536371 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.553053 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.557457 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.574259 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/768644fb-d5ea-43ba-8277-7864670945ec-kube-api-access-qnkjf\") pod \"nova-cell1-novncproxy-0\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.616340 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626100 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626182 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-config-data\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626297 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626346 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9skg2\" (UniqueName: \"kubernetes.io/projected/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-kube-api-access-9skg2\") 
pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626392 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps9h\" (UniqueName: \"kubernetes.io/projected/ad11b8e3-2922-42d9-9368-f08e826ecd80-kube-api-access-zps9h\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626413 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-config-data\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626480 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-logs\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.626516 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad11b8e3-2922-42d9-9368-f08e826ecd80-logs\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.627052 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad11b8e3-2922-42d9-9368-f08e826ecd80-logs\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.639356 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-config-data\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.641506 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.671554 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps9h\" (UniqueName: \"kubernetes.io/projected/ad11b8e3-2922-42d9-9368-f08e826ecd80-kube-api-access-zps9h\") pod \"nova-metadata-0\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") " pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.689535 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.696566 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.712720 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54974c8ff5-2tnwp"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.730033 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.730118 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-config-data\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.730152 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9skg2\" (UniqueName: \"kubernetes.io/projected/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-kube-api-access-9skg2\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.730215 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-logs\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.730800 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-logs\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 
13:18:58.732206 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.740433 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-config-data\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.746694 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54974c8ff5-2tnwp"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.753283 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.764837 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.765894 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.776923 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.783167 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9skg2\" (UniqueName: \"kubernetes.io/projected/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-kube-api-access-9skg2\") pod \"nova-api-0\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " pod="openstack/nova-api-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.790348 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834168 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-dns-svc\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834213 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-config\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834242 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-nb\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834278 
4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7khb\" (UniqueName: \"kubernetes.io/projected/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-kube-api-access-v7khb\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834335 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834368 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-sb\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834416 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-config-data\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.834445 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84t5b\" (UniqueName: \"kubernetes.io/projected/754c17de-0a0b-4307-83b8-52cec2996433-kube-api-access-84t5b\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.919050 4959 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936420 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-dns-svc\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936462 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-config\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936484 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-nb\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936517 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7khb\" (UniqueName: \"kubernetes.io/projected/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-kube-api-access-v7khb\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936581 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936656 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-sb\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936692 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-config-data\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.936720 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84t5b\" (UniqueName: \"kubernetes.io/projected/754c17de-0a0b-4307-83b8-52cec2996433-kube-api-access-84t5b\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.940171 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-dns-svc\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.942722 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-sb\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.943084 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-nb\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.949312 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-config\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.951297 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.952048 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-config-data\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.959088 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84t5b\" (UniqueName: \"kubernetes.io/projected/754c17de-0a0b-4307-83b8-52cec2996433-kube-api-access-84t5b\") pod \"dnsmasq-dns-54974c8ff5-2tnwp\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:58 crc kubenswrapper[4959]: I1007 13:18:58.973234 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7khb\" (UniqueName: \"kubernetes.io/projected/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-kube-api-access-v7khb\") pod \"nova-scheduler-0\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " pod="openstack/nova-scheduler-0"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.024016 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.078851 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.097334 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.263711 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-l98vg"]
Oct 07 13:18:59 crc kubenswrapper[4959]: W1007 13:18:59.274700 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41dde621_534c_4f39_ab29_baa7401101a8.slice/crio-ab2620eb2fd8ae975b81b70ecffe2d4e7ea6d294b27601d79d58465043e8ea6d WatchSource:0}: Error finding container ab2620eb2fd8ae975b81b70ecffe2d4e7ea6d294b27601d79d58465043e8ea6d: Status 404 returned error can't find the container with id ab2620eb2fd8ae975b81b70ecffe2d4e7ea6d294b27601d79d58465043e8ea6d
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.395385 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.516503 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.522460 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad11b8e3-2922-42d9-9368-f08e826ecd80","Type":"ContainerStarted","Data":"e921768852c3385959fa3b1a0fc19fc7f092596b6eb7d9eb24d058e21f9232d2"}
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.524134 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l98vg" event={"ID":"41dde621-534c-4f39-ab29-baa7401101a8","Type":"ContainerStarted","Data":"3d549af519014fd957043cd75c7f5cc9ef727243f01d35f9d72e53fea55d5529"}
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.524166 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l98vg" event={"ID":"41dde621-534c-4f39-ab29-baa7401101a8","Type":"ContainerStarted","Data":"ab2620eb2fd8ae975b81b70ecffe2d4e7ea6d294b27601d79d58465043e8ea6d"}
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.595049 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-l98vg" podStartSLOduration=1.595027774 podStartE2EDuration="1.595027774s" podCreationTimestamp="2025-10-07 13:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:18:59.542352243 +0000 UTC m=+1091.703074920" watchObservedRunningTime="2025-10-07 13:18:59.595027774 +0000 UTC m=+1091.755750441"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.600367 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.673809 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9g4tj"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.674936 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.682057 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.682098 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.704928 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9g4tj"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.713617 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54974c8ff5-2tnwp"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.757418 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhn7z\" (UniqueName: \"kubernetes.io/projected/8b53f390-b59f-45fd-8ec7-e405e011f07d-kube-api-access-nhn7z\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.757891 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-config-data\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.758060 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-scripts\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.758184 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.762740 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.859187 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-config-data\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.859306 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-scripts\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.859812 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.859871 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhn7z\" (UniqueName: \"kubernetes.io/projected/8b53f390-b59f-45fd-8ec7-e405e011f07d-kube-api-access-nhn7z\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.863302 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-scripts\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.863549 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.867150 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-config-data\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.879196 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhn7z\" (UniqueName: \"kubernetes.io/projected/8b53f390-b59f-45fd-8ec7-e405e011f07d-kube-api-access-nhn7z\") pod \"nova-cell1-conductor-db-sync-9g4tj\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:18:59 crc kubenswrapper[4959]: I1007 13:18:59.993747 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9g4tj"
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.488083 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9g4tj"]
Oct 07 13:19:00 crc kubenswrapper[4959]: W1007 13:19:00.504350 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b53f390_b59f_45fd_8ec7_e405e011f07d.slice/crio-4a2f13cb6c1edb016274e0390b768f75084684d8a36daf0b5b39233c11089b0a WatchSource:0}: Error finding container 4a2f13cb6c1edb016274e0390b768f75084684d8a36daf0b5b39233c11089b0a: Status 404 returned error can't find the container with id 4a2f13cb6c1edb016274e0390b768f75084684d8a36daf0b5b39233c11089b0a
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.544671 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54","Type":"ContainerStarted","Data":"89f04fa55561803593093d5cf32a43d5c95a7615505aa24f424e37b0b9c38dca"}
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.551533 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8dd1506f-f443-4edf-8a68-ae3f2228ebbc","Type":"ContainerStarted","Data":"947bb3ae400c420b1a2006ba50b0ed92ae495c0cdc637e7e0adabf68ff803e60"}
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.556481 4959 generic.go:334] "Generic (PLEG): container finished" podID="754c17de-0a0b-4307-83b8-52cec2996433" containerID="ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c" exitCode=0
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.556547 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" event={"ID":"754c17de-0a0b-4307-83b8-52cec2996433","Type":"ContainerDied","Data":"ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c"}
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.556573 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" event={"ID":"754c17de-0a0b-4307-83b8-52cec2996433","Type":"ContainerStarted","Data":"e6ae1108a6d4f316776415a430bd16a88fe8a23d1d043ad47932be47c04a2fd3"}
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.568363 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"768644fb-d5ea-43ba-8277-7864670945ec","Type":"ContainerStarted","Data":"f08bcbaed10a72aa14e335c9c63f7650e5afbecb33f26ecbfe9802a4737bb8ea"}
Oct 07 13:19:00 crc kubenswrapper[4959]: I1007 13:19:00.608595 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" event={"ID":"8b53f390-b59f-45fd-8ec7-e405e011f07d","Type":"ContainerStarted","Data":"4a2f13cb6c1edb016274e0390b768f75084684d8a36daf0b5b39233c11089b0a"}
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.616156 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" event={"ID":"754c17de-0a0b-4307-83b8-52cec2996433","Type":"ContainerStarted","Data":"d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947"}
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.616493 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.619423 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" event={"ID":"8b53f390-b59f-45fd-8ec7-e405e011f07d","Type":"ContainerStarted","Data":"45990344e2031840c1955601b359ba30a984e300f695da70e737a9e39b96fedd"}
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.639524 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" podStartSLOduration=3.639506298 podStartE2EDuration="3.639506298s" podCreationTimestamp="2025-10-07 13:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:01.63545907 +0000 UTC m=+1093.796181747" watchObservedRunningTime="2025-10-07 13:19:01.639506298 +0000 UTC m=+1093.800228975"
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.654553 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" podStartSLOduration=2.654536608 podStartE2EDuration="2.654536608s" podCreationTimestamp="2025-10-07 13:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:01.648838041 +0000 UTC m=+1093.809560718" watchObservedRunningTime="2025-10-07 13:19:01.654536608 +0000 UTC m=+1093.815259285"
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.968439 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:19:01 crc kubenswrapper[4959]: I1007 13:19:01.978605 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.646718 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54","Type":"ContainerStarted","Data":"bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d"}
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.648892 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8dd1506f-f443-4edf-8a68-ae3f2228ebbc","Type":"ContainerStarted","Data":"f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe"}
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.648936 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8dd1506f-f443-4edf-8a68-ae3f2228ebbc","Type":"ContainerStarted","Data":"e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543"}
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.651162 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad11b8e3-2922-42d9-9368-f08e826ecd80","Type":"ContainerStarted","Data":"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"}
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.651202 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-log" containerID="cri-o://4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3" gracePeriod=30
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.651238 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad11b8e3-2922-42d9-9368-f08e826ecd80","Type":"ContainerStarted","Data":"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"}
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.651444 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-metadata" containerID="cri-o://9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4" gracePeriod=30
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.653406 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"768644fb-d5ea-43ba-8277-7864670945ec","Type":"ContainerStarted","Data":"d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800"}
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.653512 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="768644fb-d5ea-43ba-8277-7864670945ec" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800" gracePeriod=30
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.661752 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.988579311 podStartE2EDuration="5.661735912s" podCreationTimestamp="2025-10-07 13:18:58 +0000 UTC" firstStartedPulling="2025-10-07 13:18:59.765827802 +0000 UTC m=+1091.926550479" lastFinishedPulling="2025-10-07 13:19:02.438984403 +0000 UTC m=+1094.599707080" observedRunningTime="2025-10-07 13:19:03.660504016 +0000 UTC m=+1095.821226713" watchObservedRunningTime="2025-10-07 13:19:03.661735912 +0000 UTC m=+1095.822458589"
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.680362 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.771022424 podStartE2EDuration="5.680337776s" podCreationTimestamp="2025-10-07 13:18:58 +0000 UTC" firstStartedPulling="2025-10-07 13:18:59.529704812 +0000 UTC m=+1091.690427499" lastFinishedPulling="2025-10-07 13:19:02.439020174 +0000 UTC m=+1094.599742851" observedRunningTime="2025-10-07 13:19:03.67602525 +0000 UTC m=+1095.836747937" watchObservedRunningTime="2025-10-07 13:19:03.680337776 +0000 UTC m=+1095.841060473"
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.690252 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.690303 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.697808 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.698896 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.88188949 podStartE2EDuration="5.698878429s" podCreationTimestamp="2025-10-07 13:18:58 +0000 UTC" firstStartedPulling="2025-10-07 13:18:59.622081986 +0000 UTC m=+1091.782804663" lastFinishedPulling="2025-10-07 13:19:02.439070925 +0000 UTC m=+1094.599793602" observedRunningTime="2025-10-07 13:19:03.695614643 +0000 UTC m=+1095.856337320" watchObservedRunningTime="2025-10-07 13:19:03.698878429 +0000 UTC m=+1095.859601106"
Oct 07 13:19:03 crc kubenswrapper[4959]: I1007 13:19:03.711678 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.673653426 podStartE2EDuration="5.711659603s" podCreationTimestamp="2025-10-07 13:18:58 +0000 UTC" firstStartedPulling="2025-10-07 13:18:59.403573212 +0000 UTC m=+1091.564295889" lastFinishedPulling="2025-10-07 13:19:02.441579389 +0000 UTC m=+1094.602302066" observedRunningTime="2025-10-07 13:19:03.711399295 +0000 UTC m=+1095.872121992" watchObservedRunningTime="2025-10-07 13:19:03.711659603 +0000 UTC m=+1095.872382280"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.099122 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.180969 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.247998 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zps9h\" (UniqueName: \"kubernetes.io/projected/ad11b8e3-2922-42d9-9368-f08e826ecd80-kube-api-access-zps9h\") pod \"ad11b8e3-2922-42d9-9368-f08e826ecd80\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") "
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.248064 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-config-data\") pod \"ad11b8e3-2922-42d9-9368-f08e826ecd80\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") "
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.248170 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-combined-ca-bundle\") pod \"ad11b8e3-2922-42d9-9368-f08e826ecd80\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") "
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.248248 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad11b8e3-2922-42d9-9368-f08e826ecd80-logs\") pod \"ad11b8e3-2922-42d9-9368-f08e826ecd80\" (UID: \"ad11b8e3-2922-42d9-9368-f08e826ecd80\") "
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.248911 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad11b8e3-2922-42d9-9368-f08e826ecd80-logs" (OuterVolumeSpecName: "logs") pod "ad11b8e3-2922-42d9-9368-f08e826ecd80" (UID: "ad11b8e3-2922-42d9-9368-f08e826ecd80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.253003 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad11b8e3-2922-42d9-9368-f08e826ecd80-kube-api-access-zps9h" (OuterVolumeSpecName: "kube-api-access-zps9h") pod "ad11b8e3-2922-42d9-9368-f08e826ecd80" (UID: "ad11b8e3-2922-42d9-9368-f08e826ecd80"). InnerVolumeSpecName "kube-api-access-zps9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.273702 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad11b8e3-2922-42d9-9368-f08e826ecd80" (UID: "ad11b8e3-2922-42d9-9368-f08e826ecd80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.280753 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-config-data" (OuterVolumeSpecName: "config-data") pod "ad11b8e3-2922-42d9-9368-f08e826ecd80" (UID: "ad11b8e3-2922-42d9-9368-f08e826ecd80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.350168 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad11b8e3-2922-42d9-9368-f08e826ecd80-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.350215 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zps9h\" (UniqueName: \"kubernetes.io/projected/ad11b8e3-2922-42d9-9368-f08e826ecd80-kube-api-access-zps9h\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.350232 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.350245 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad11b8e3-2922-42d9-9368-f08e826ecd80-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.545384 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.545691 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" containerName="kube-state-metrics" containerID="cri-o://200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4" gracePeriod=30
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670448 4959 generic.go:334] "Generic (PLEG): container finished" podID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerID="9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4" exitCode=0
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670722 4959 generic.go:334] "Generic (PLEG): container finished" podID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerID="4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3" exitCode=143
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670764 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670657 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad11b8e3-2922-42d9-9368-f08e826ecd80","Type":"ContainerDied","Data":"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"}
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670828 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad11b8e3-2922-42d9-9368-f08e826ecd80","Type":"ContainerDied","Data":"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"}
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670842 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad11b8e3-2922-42d9-9368-f08e826ecd80","Type":"ContainerDied","Data":"e921768852c3385959fa3b1a0fc19fc7f092596b6eb7d9eb24d058e21f9232d2"}
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.670857 4959 scope.go:117] "RemoveContainer" containerID="9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.739645 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.741387 4959 scope.go:117] "RemoveContainer" containerID="4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.766133 4959 scope.go:117] "RemoveContainer" containerID="9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"
Oct 07 13:19:04 crc kubenswrapper[4959]: E1007 13:19:04.773312 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4\": container with ID starting with 9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4 not found: ID does not exist" containerID="9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.773365 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"} err="failed to get container status \"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4\": rpc error: code = NotFound desc = could not find container \"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4\": container with ID starting with 9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4 not found: ID does not exist"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.773396 4959 scope.go:117] "RemoveContainer" containerID="4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"
Oct 07 13:19:04 crc kubenswrapper[4959]: E1007 13:19:04.774856 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3\": container with ID starting with 4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3 not found: ID does not exist" containerID="4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.774911 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"} err="failed to get container status \"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3\": rpc error: code = NotFound desc = could not find container \"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3\": container with ID starting with 4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3 not found: ID does not exist"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.774946 4959 scope.go:117] "RemoveContainer" containerID="9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.775294 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4"} err="failed to get container status \"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4\": rpc error: code = NotFound desc = could not find container \"9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4\": container with ID starting with 9e671125f50a7e730c00b5ea221b6f671941000965fd589a8bb33fc63d1581e4 not found: ID does not exist"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.775331 4959 scope.go:117] "RemoveContainer" containerID="4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.775590 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3"} err="failed to get container status \"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3\": rpc error: code = NotFound desc = could not find container \"4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3\": container with ID starting with 4caebea7edcc84aa3475d31830f3aa91d902c929741ba0b2d0df9f0fc09eaff3 not found: ID does not exist"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.786657 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.797961 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:19:04 crc kubenswrapper[4959]: E1007 13:19:04.798494 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-metadata"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.798517 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-metadata"
Oct 07 13:19:04 crc kubenswrapper[4959]: E1007 13:19:04.798557 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-log"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.798569 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-log"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.798824 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-log"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.798863 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" containerName="nova-metadata-metadata"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.801383 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.804690 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.804878 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.808789 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.823325 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad11b8e3-2922-42d9-9368-f08e826ecd80" path="/var/lib/kubelet/pods/ad11b8e3-2922-42d9-9368-f08e826ecd80/volumes"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.862335 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.862443 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.862462 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ff7ca2-9dc3-4b68-856d-19e508aded87-logs\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0"
Oct 07 13:19:04 crc kubenswrapper[4959]: I1007
13:19:04.862488 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-config-data\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.862524 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvpqh\" (UniqueName: \"kubernetes.io/projected/f6ff7ca2-9dc3-4b68-856d-19e508aded87-kube-api-access-vvpqh\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.966551 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-config-data\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.966867 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvpqh\" (UniqueName: \"kubernetes.io/projected/f6ff7ca2-9dc3-4b68-856d-19e508aded87-kube-api-access-vvpqh\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.966937 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.967028 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.967042 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ff7ca2-9dc3-4b68-856d-19e508aded87-logs\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.967370 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ff7ca2-9dc3-4b68-856d-19e508aded87-logs\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.974815 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-config-data\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.977958 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: I1007 13:19:04.981078 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:04 crc kubenswrapper[4959]: 
I1007 13:19:04.990805 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvpqh\" (UniqueName: \"kubernetes.io/projected/f6ff7ca2-9dc3-4b68-856d-19e508aded87-kube-api-access-vvpqh\") pod \"nova-metadata-0\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " pod="openstack/nova-metadata-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.070250 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.120796 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.170422 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5lvl\" (UniqueName: \"kubernetes.io/projected/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61-kube-api-access-w5lvl\") pod \"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61\" (UID: \"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61\") " Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.174020 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61-kube-api-access-w5lvl" (OuterVolumeSpecName: "kube-api-access-w5lvl") pod "2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" (UID: "2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61"). InnerVolumeSpecName "kube-api-access-w5lvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.272157 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5lvl\" (UniqueName: \"kubernetes.io/projected/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61-kube-api-access-w5lvl\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.582441 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.682427 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6ff7ca2-9dc3-4b68-856d-19e508aded87","Type":"ContainerStarted","Data":"d6bf3035efc6ec8e471021c45abfcda899649d7a5bf0cd887cb1d15c3d7fa0b9"} Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.684825 4959 generic.go:334] "Generic (PLEG): container finished" podID="2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" containerID="200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4" exitCode=2 Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.684873 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61","Type":"ContainerDied","Data":"200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4"} Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.684899 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.684925 4959 scope.go:117] "RemoveContainer" containerID="200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.684910 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61","Type":"ContainerDied","Data":"f6b716966a4427083618248f96c372f67c8cdc1f1111a8d4c71c5261285ffbec"} Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.712172 4959 scope.go:117] "RemoveContainer" containerID="200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4" Oct 07 13:19:05 crc kubenswrapper[4959]: E1007 13:19:05.713165 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4\": container with ID starting with 200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4 not found: ID does not exist" containerID="200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.713205 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4"} err="failed to get container status \"200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4\": rpc error: code = NotFound desc = could not find container \"200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4\": container with ID starting with 200019d76ddbeb9cf92980c3de083b38f2847d79a6d3767d33176a34578dbfb4 not found: ID does not exist" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.727965 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 
13:19:05.737078 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.746470 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:19:05 crc kubenswrapper[4959]: E1007 13:19:05.746882 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" containerName="kube-state-metrics" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.746899 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" containerName="kube-state-metrics" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.747090 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" containerName="kube-state-metrics" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.747716 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.749667 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.749914 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.760092 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.805691 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.805948 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-central-agent" 
containerID="cri-o://1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1" gracePeriod=30 Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.806063 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="proxy-httpd" containerID="cri-o://9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8" gracePeriod=30 Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.806112 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="sg-core" containerID="cri-o://8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f" gracePeriod=30 Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.806142 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-notification-agent" containerID="cri-o://b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004" gracePeriod=30 Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.882876 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.883521 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf2g9\" (UniqueName: \"kubernetes.io/projected/44961788-4f6e-4912-a20e-4648a7760dce-kube-api-access-xf2g9\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc 
kubenswrapper[4959]: I1007 13:19:05.883574 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.883706 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.985391 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.985457 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.985538 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf2g9\" (UniqueName: \"kubernetes.io/projected/44961788-4f6e-4912-a20e-4648a7760dce-kube-api-access-xf2g9\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: 
I1007 13:19:05.985569 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.991425 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.993138 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:05 crc kubenswrapper[4959]: I1007 13:19:05.995699 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/44961788-4f6e-4912-a20e-4648a7760dce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.001396 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf2g9\" (UniqueName: \"kubernetes.io/projected/44961788-4f6e-4912-a20e-4648a7760dce-kube-api-access-xf2g9\") pod \"kube-state-metrics-0\" (UID: \"44961788-4f6e-4912-a20e-4648a7760dce\") " pod="openstack/kube-state-metrics-0" Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.067664 4959 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.515504 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 13:19:06 crc kubenswrapper[4959]: W1007 13:19:06.519954 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44961788_4f6e_4912_a20e_4648a7760dce.slice/crio-79a6d56d1c4ddb6e6f16513a574adbfdac64071d5177fd9e4890d78d156606e5 WatchSource:0}: Error finding container 79a6d56d1c4ddb6e6f16513a574adbfdac64071d5177fd9e4890d78d156606e5: Status 404 returned error can't find the container with id 79a6d56d1c4ddb6e6f16513a574adbfdac64071d5177fd9e4890d78d156606e5 Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.694422 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"44961788-4f6e-4912-a20e-4648a7760dce","Type":"ContainerStarted","Data":"79a6d56d1c4ddb6e6f16513a574adbfdac64071d5177fd9e4890d78d156606e5"} Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.698492 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6ff7ca2-9dc3-4b68-856d-19e508aded87","Type":"ContainerStarted","Data":"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd"} Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.698531 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6ff7ca2-9dc3-4b68-856d-19e508aded87","Type":"ContainerStarted","Data":"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5"} Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.704171 4959 generic.go:334] "Generic (PLEG): container finished" podID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerID="9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8" exitCode=0 Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.704203 
4959 generic.go:334] "Generic (PLEG): container finished" podID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerID="8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f" exitCode=2 Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.704211 4959 generic.go:334] "Generic (PLEG): container finished" podID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerID="1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1" exitCode=0 Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.704232 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerDied","Data":"9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8"} Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.704258 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerDied","Data":"8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f"} Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.704317 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerDied","Data":"1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1"} Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.722195 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.722173096 podStartE2EDuration="2.722173096s" podCreationTimestamp="2025-10-07 13:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:06.716037207 +0000 UTC m=+1098.876759894" watchObservedRunningTime="2025-10-07 13:19:06.722173096 +0000 UTC m=+1098.882895773" Oct 07 13:19:06 crc kubenswrapper[4959]: I1007 13:19:06.821342 4959 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61" path="/var/lib/kubelet/pods/2b4bc70e-cbbe-4c3f-a096-f2b1b0d89e61/volumes" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.311683 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408153 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-combined-ca-bundle\") pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408302 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-run-httpd\") pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408333 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxrh8\" (UniqueName: \"kubernetes.io/projected/d42fa5e4-9905-42d9-b7cc-e33c6198f012-kube-api-access-dxrh8\") pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408411 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-sg-core-conf-yaml\") pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408467 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-config-data\") 
pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408578 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-log-httpd\") pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408684 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-scripts\") pod \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\" (UID: \"d42fa5e4-9905-42d9-b7cc-e33c6198f012\") " Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.408845 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.409133 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.409173 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.413622 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42fa5e4-9905-42d9-b7cc-e33c6198f012-kube-api-access-dxrh8" (OuterVolumeSpecName: "kube-api-access-dxrh8") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "kube-api-access-dxrh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.415719 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-scripts" (OuterVolumeSpecName: "scripts") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.433724 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.491852 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.511126 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d42fa5e4-9905-42d9-b7cc-e33c6198f012-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.511461 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.511599 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.511721 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxrh8\" (UniqueName: \"kubernetes.io/projected/d42fa5e4-9905-42d9-b7cc-e33c6198f012-kube-api-access-dxrh8\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.511856 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.521180 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-config-data" (OuterVolumeSpecName: "config-data") pod "d42fa5e4-9905-42d9-b7cc-e33c6198f012" (UID: "d42fa5e4-9905-42d9-b7cc-e33c6198f012"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.613025 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42fa5e4-9905-42d9-b7cc-e33c6198f012-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.712842 4959 generic.go:334] "Generic (PLEG): container finished" podID="8b53f390-b59f-45fd-8ec7-e405e011f07d" containerID="45990344e2031840c1955601b359ba30a984e300f695da70e737a9e39b96fedd" exitCode=0 Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.712891 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" event={"ID":"8b53f390-b59f-45fd-8ec7-e405e011f07d","Type":"ContainerDied","Data":"45990344e2031840c1955601b359ba30a984e300f695da70e737a9e39b96fedd"} Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.715587 4959 generic.go:334] "Generic (PLEG): container finished" podID="41dde621-534c-4f39-ab29-baa7401101a8" containerID="3d549af519014fd957043cd75c7f5cc9ef727243f01d35f9d72e53fea55d5529" exitCode=0 Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.715646 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l98vg" event={"ID":"41dde621-534c-4f39-ab29-baa7401101a8","Type":"ContainerDied","Data":"3d549af519014fd957043cd75c7f5cc9ef727243f01d35f9d72e53fea55d5529"} Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.718928 4959 generic.go:334] "Generic (PLEG): container finished" podID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerID="b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004" exitCode=0 Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.719084 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerDied","Data":"b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004"} Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.719123 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d42fa5e4-9905-42d9-b7cc-e33c6198f012","Type":"ContainerDied","Data":"82ae37bb2056f863e9b1787e766ee0f5653c13078d96e8fe038fe5a04937f60c"} Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.719162 4959 scope.go:117] "RemoveContainer" containerID="9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.719331 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.728932 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"44961788-4f6e-4912-a20e-4648a7760dce","Type":"ContainerStarted","Data":"17367fff096194a480f97891d47887d309567a3865618698f41e5fc4c46dce8c"} Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.754101 4959 scope.go:117] "RemoveContainer" containerID="8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.784475 4959 scope.go:117] "RemoveContainer" containerID="b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.785685 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.793211 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.806083 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.455136313 podStartE2EDuration="2.806062512s" 
podCreationTimestamp="2025-10-07 13:19:05 +0000 UTC" firstStartedPulling="2025-10-07 13:19:06.521871765 +0000 UTC m=+1098.682594442" lastFinishedPulling="2025-10-07 13:19:06.872797964 +0000 UTC m=+1099.033520641" observedRunningTime="2025-10-07 13:19:07.795500783 +0000 UTC m=+1099.956223460" watchObservedRunningTime="2025-10-07 13:19:07.806062512 +0000 UTC m=+1099.966785189" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.810182 4959 scope.go:117] "RemoveContainer" containerID="1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.833777 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.834895 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="proxy-httpd" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.834923 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="proxy-httpd" Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.834947 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-central-agent" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.834982 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-central-agent" Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.835006 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="sg-core" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.835014 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="sg-core" Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.835058 4959 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-notification-agent" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.835067 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-notification-agent" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.835506 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-central-agent" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.835557 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="sg-core" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.835570 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="proxy-httpd" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.835597 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" containerName="ceilometer-notification-agent" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.849983 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.853564 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.853880 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.854337 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.854989 4959 scope.go:117] "RemoveContainer" containerID="9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8" Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.855379 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8\": container with ID starting with 9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8 not found: ID does not exist" containerID="9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.855416 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8"} err="failed to get container status \"9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8\": rpc error: code = NotFound desc = could not find container \"9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8\": container with ID starting with 9bd9fed9f02c7ee54d62e850f66d96c3800573d2386353aa7fd7fdfb5d741bf8 not found: ID does not exist" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.855454 4959 scope.go:117] "RemoveContainer" containerID="8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f" Oct 07 13:19:07 crc 
kubenswrapper[4959]: E1007 13:19:07.855904 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f\": container with ID starting with 8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f not found: ID does not exist" containerID="8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.855969 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f"} err="failed to get container status \"8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f\": rpc error: code = NotFound desc = could not find container \"8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f\": container with ID starting with 8bd0d20a1bb4882dcd970d32ea750c5a028fd2e27c5d11b12bd24ba69667ce2f not found: ID does not exist" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.855988 4959 scope.go:117] "RemoveContainer" containerID="b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.856012 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.856814 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004\": container with ID starting with b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004 not found: ID does not exist" containerID="b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.856872 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004"} err="failed to get container status \"b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004\": rpc error: code = NotFound desc = could not find container \"b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004\": container with ID starting with b23764e6f4a680f7a1fd1c0900092f87e4274d38b9946723d520426389dd1004 not found: ID does not exist" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.856892 4959 scope.go:117] "RemoveContainer" containerID="1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1" Oct 07 13:19:07 crc kubenswrapper[4959]: E1007 13:19:07.859357 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1\": container with ID starting with 1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1 not found: ID does not exist" containerID="1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.859389 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1"} err="failed to get container status \"1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1\": rpc error: code = NotFound desc = could not find container \"1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1\": container with ID starting with 1ca296600d79ee878b3f536f85f9e86e590f1e2057e579b9328264b16f994bb1 not found: ID does not exist" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943188 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfg4j\" (UniqueName: \"kubernetes.io/projected/6a44aa4c-74c8-4334-b63e-1f845950d1c1-kube-api-access-zfg4j\") pod 
\"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943260 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-run-httpd\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943290 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-scripts\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943304 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-log-httpd\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943341 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943388 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 
13:19:07.943438 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-config-data\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:07 crc kubenswrapper[4959]: I1007 13:19:07.943495 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044671 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044726 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044774 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-config-data\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044832 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044877 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfg4j\" (UniqueName: \"kubernetes.io/projected/6a44aa4c-74c8-4334-b63e-1f845950d1c1-kube-api-access-zfg4j\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044900 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-run-httpd\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044922 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-scripts\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.044939 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-log-httpd\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.045368 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-log-httpd\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.045563 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-run-httpd\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.049489 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.049719 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.050376 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-scripts\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.059117 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-config-data\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.060462 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " 
pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.062212 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfg4j\" (UniqueName: \"kubernetes.io/projected/6a44aa4c-74c8-4334-b63e-1f845950d1c1-kube-api-access-zfg4j\") pod \"ceilometer-0\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.173390 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.575378 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:08 crc kubenswrapper[4959]: W1007 13:19:08.580210 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a44aa4c_74c8_4334_b63e_1f845950d1c1.slice/crio-649e148dd93ec2aefcaffb40faf03c46bc4347968e0f9ebd099b0465e39ec34d WatchSource:0}: Error finding container 649e148dd93ec2aefcaffb40faf03c46bc4347968e0f9ebd099b0465e39ec34d: Status 404 returned error can't find the container with id 649e148dd93ec2aefcaffb40faf03c46bc4347968e0f9ebd099b0465e39ec34d Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.739300 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerStarted","Data":"649e148dd93ec2aefcaffb40faf03c46bc4347968e0f9ebd099b0465e39ec34d"} Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.742051 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 13:19:08 crc kubenswrapper[4959]: I1007 13:19:08.819588 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42fa5e4-9905-42d9-b7cc-e33c6198f012" path="/var/lib/kubelet/pods/d42fa5e4-9905-42d9-b7cc-e33c6198f012/volumes" Oct 07 13:19:09 crc 
kubenswrapper[4959]: I1007 13:19:09.026066 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.026106 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.087794 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.098923 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.158564 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85494b87f-gljj2"] Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.158730 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.159135 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85494b87f-gljj2" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="dnsmasq-dns" containerID="cri-o://6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017" gracePeriod=10 Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.224755 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85494b87f-gljj2" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: connect: connection refused" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.262132 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.271121 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369535 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-scripts\") pod \"41dde621-534c-4f39-ab29-baa7401101a8\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369590 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-combined-ca-bundle\") pod \"8b53f390-b59f-45fd-8ec7-e405e011f07d\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369662 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-scripts\") pod \"8b53f390-b59f-45fd-8ec7-e405e011f07d\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369757 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhn7z\" (UniqueName: \"kubernetes.io/projected/8b53f390-b59f-45fd-8ec7-e405e011f07d-kube-api-access-nhn7z\") pod \"8b53f390-b59f-45fd-8ec7-e405e011f07d\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369864 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qtn2\" (UniqueName: \"kubernetes.io/projected/41dde621-534c-4f39-ab29-baa7401101a8-kube-api-access-2qtn2\") pod \"41dde621-534c-4f39-ab29-baa7401101a8\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369891 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-config-data\") pod \"8b53f390-b59f-45fd-8ec7-e405e011f07d\" (UID: \"8b53f390-b59f-45fd-8ec7-e405e011f07d\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369913 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-config-data\") pod \"41dde621-534c-4f39-ab29-baa7401101a8\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.369967 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-combined-ca-bundle\") pod \"41dde621-534c-4f39-ab29-baa7401101a8\" (UID: \"41dde621-534c-4f39-ab29-baa7401101a8\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.376823 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b53f390-b59f-45fd-8ec7-e405e011f07d-kube-api-access-nhn7z" (OuterVolumeSpecName: "kube-api-access-nhn7z") pod "8b53f390-b59f-45fd-8ec7-e405e011f07d" (UID: "8b53f390-b59f-45fd-8ec7-e405e011f07d"). InnerVolumeSpecName "kube-api-access-nhn7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.377115 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dde621-534c-4f39-ab29-baa7401101a8-kube-api-access-2qtn2" (OuterVolumeSpecName: "kube-api-access-2qtn2") pod "41dde621-534c-4f39-ab29-baa7401101a8" (UID: "41dde621-534c-4f39-ab29-baa7401101a8"). InnerVolumeSpecName "kube-api-access-2qtn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.380799 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-scripts" (OuterVolumeSpecName: "scripts") pod "41dde621-534c-4f39-ab29-baa7401101a8" (UID: "41dde621-534c-4f39-ab29-baa7401101a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.381581 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-scripts" (OuterVolumeSpecName: "scripts") pod "8b53f390-b59f-45fd-8ec7-e405e011f07d" (UID: "8b53f390-b59f-45fd-8ec7-e405e011f07d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.409848 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41dde621-534c-4f39-ab29-baa7401101a8" (UID: "41dde621-534c-4f39-ab29-baa7401101a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.411455 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-config-data" (OuterVolumeSpecName: "config-data") pod "41dde621-534c-4f39-ab29-baa7401101a8" (UID: "41dde621-534c-4f39-ab29-baa7401101a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.418008 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-config-data" (OuterVolumeSpecName: "config-data") pod "8b53f390-b59f-45fd-8ec7-e405e011f07d" (UID: "8b53f390-b59f-45fd-8ec7-e405e011f07d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.429714 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b53f390-b59f-45fd-8ec7-e405e011f07d" (UID: "8b53f390-b59f-45fd-8ec7-e405e011f07d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.472807 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.473132 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.473199 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.473256 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc 
kubenswrapper[4959]: I1007 13:19:09.473331 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhn7z\" (UniqueName: \"kubernetes.io/projected/8b53f390-b59f-45fd-8ec7-e405e011f07d-kube-api-access-nhn7z\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.473414 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qtn2\" (UniqueName: \"kubernetes.io/projected/41dde621-534c-4f39-ab29-baa7401101a8-kube-api-access-2qtn2\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.473488 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b53f390-b59f-45fd-8ec7-e405e011f07d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.473577 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dde621-534c-4f39-ab29-baa7401101a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.651220 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.749842 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" event={"ID":"8b53f390-b59f-45fd-8ec7-e405e011f07d","Type":"ContainerDied","Data":"4a2f13cb6c1edb016274e0390b768f75084684d8a36daf0b5b39233c11089b0a"} Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.749876 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9g4tj" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.749881 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a2f13cb6c1edb016274e0390b768f75084684d8a36daf0b5b39233c11089b0a" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.751718 4959 generic.go:334] "Generic (PLEG): container finished" podID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerID="6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017" exitCode=0 Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.751767 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85494b87f-gljj2" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.751787 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85494b87f-gljj2" event={"ID":"32a05d93-0c05-45c0-872e-decc26f3fb0e","Type":"ContainerDied","Data":"6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017"} Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.751814 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85494b87f-gljj2" event={"ID":"32a05d93-0c05-45c0-872e-decc26f3fb0e","Type":"ContainerDied","Data":"6eaf6b62d947fac20be89ffb3eb65058ded3eea830af1a77a7e9aba9b0738a50"} Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.751835 4959 scope.go:117] "RemoveContainer" containerID="6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.754824 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l98vg" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.757225 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l98vg" event={"ID":"41dde621-534c-4f39-ab29-baa7401101a8","Type":"ContainerDied","Data":"ab2620eb2fd8ae975b81b70ecffe2d4e7ea6d294b27601d79d58465043e8ea6d"} Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.757263 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab2620eb2fd8ae975b81b70ecffe2d4e7ea6d294b27601d79d58465043e8ea6d" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.778207 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l96l8\" (UniqueName: \"kubernetes.io/projected/32a05d93-0c05-45c0-872e-decc26f3fb0e-kube-api-access-l96l8\") pod \"32a05d93-0c05-45c0-872e-decc26f3fb0e\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.778457 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-sb\") pod \"32a05d93-0c05-45c0-872e-decc26f3fb0e\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.778579 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-nb\") pod \"32a05d93-0c05-45c0-872e-decc26f3fb0e\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.778697 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-config\") pod \"32a05d93-0c05-45c0-872e-decc26f3fb0e\" (UID: 
\"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.778855 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-dns-svc\") pod \"32a05d93-0c05-45c0-872e-decc26f3fb0e\" (UID: \"32a05d93-0c05-45c0-872e-decc26f3fb0e\") " Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.795859 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a05d93-0c05-45c0-872e-decc26f3fb0e-kube-api-access-l96l8" (OuterVolumeSpecName: "kube-api-access-l96l8") pod "32a05d93-0c05-45c0-872e-decc26f3fb0e" (UID: "32a05d93-0c05-45c0-872e-decc26f3fb0e"). InnerVolumeSpecName "kube-api-access-l96l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.836734 4959 scope.go:117] "RemoveContainer" containerID="99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.842697 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:19:09 crc kubenswrapper[4959]: E1007 13:19:09.843103 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="init" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843117 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="init" Oct 07 13:19:09 crc kubenswrapper[4959]: E1007 13:19:09.843131 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="dnsmasq-dns" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843137 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="dnsmasq-dns" Oct 07 13:19:09 crc kubenswrapper[4959]: E1007 
13:19:09.843147 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b53f390-b59f-45fd-8ec7-e405e011f07d" containerName="nova-cell1-conductor-db-sync" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843152 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b53f390-b59f-45fd-8ec7-e405e011f07d" containerName="nova-cell1-conductor-db-sync" Oct 07 13:19:09 crc kubenswrapper[4959]: E1007 13:19:09.843172 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41dde621-534c-4f39-ab29-baa7401101a8" containerName="nova-manage" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843181 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="41dde621-534c-4f39-ab29-baa7401101a8" containerName="nova-manage" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843340 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="41dde621-534c-4f39-ab29-baa7401101a8" containerName="nova-manage" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843356 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" containerName="dnsmasq-dns" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843368 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b53f390-b59f-45fd-8ec7-e405e011f07d" containerName="nova-cell1-conductor-db-sync" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.843941 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.846475 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.850144 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.859942 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.860797 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-config" (OuterVolumeSpecName: "config") pod "32a05d93-0c05-45c0-872e-decc26f3fb0e" (UID: "32a05d93-0c05-45c0-872e-decc26f3fb0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.866083 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32a05d93-0c05-45c0-872e-decc26f3fb0e" (UID: "32a05d93-0c05-45c0-872e-decc26f3fb0e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.882752 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32a05d93-0c05-45c0-872e-decc26f3fb0e" (UID: "32a05d93-0c05-45c0-872e-decc26f3fb0e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.883861 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l96l8\" (UniqueName: \"kubernetes.io/projected/32a05d93-0c05-45c0-872e-decc26f3fb0e-kube-api-access-l96l8\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.883881 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.883905 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.883914 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.884110 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32a05d93-0c05-45c0-872e-decc26f3fb0e" (UID: "32a05d93-0c05-45c0-872e-decc26f3fb0e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.893846 4959 scope.go:117] "RemoveContainer" containerID="6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017" Oct 07 13:19:09 crc kubenswrapper[4959]: E1007 13:19:09.897236 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017\": container with ID starting with 6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017 not found: ID does not exist" containerID="6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.897267 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017"} err="failed to get container status \"6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017\": rpc error: code = NotFound desc = could not find container \"6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017\": container with ID starting with 6187a290a5801d6dd22825530cdcad8dd9603fb8938f63b225153bcca1176017 not found: ID does not exist" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.897316 4959 scope.go:117] "RemoveContainer" containerID="99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef" Oct 07 13:19:09 crc kubenswrapper[4959]: E1007 13:19:09.900779 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef\": container with ID starting with 99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef not found: ID does not exist" containerID="99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.900928 
4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef"} err="failed to get container status \"99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef\": rpc error: code = NotFound desc = could not find container \"99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef\": container with ID starting with 99b68d7172bcb6622350ffbc6d525c2cf606731799c49d69f9fc12f2ac81fcef not found: ID does not exist" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.928974 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.929163 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-log" containerID="cri-o://e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543" gracePeriod=30 Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.929299 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-api" containerID="cri-o://f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe" gracePeriod=30 Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.937400 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.176:8774/\": EOF" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.937403 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.176:8774/\": EOF" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 
13:19:09.979461 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.979751 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-log" containerID="cri-o://a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5" gracePeriod=30 Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.980052 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-metadata" containerID="cri-o://fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd" gracePeriod=30 Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.989130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmdh\" (UniqueName: \"kubernetes.io/projected/8446fa80-aebe-45ba-a6a7-4f51402f3d38-kube-api-access-5kmdh\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.989189 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8446fa80-aebe-45ba-a6a7-4f51402f3d38-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.989302 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8446fa80-aebe-45ba-a6a7-4f51402f3d38-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " 
pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:09 crc kubenswrapper[4959]: I1007 13:19:09.989440 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a05d93-0c05-45c0-872e-decc26f3fb0e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.087736 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85494b87f-gljj2"] Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.090614 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmdh\" (UniqueName: \"kubernetes.io/projected/8446fa80-aebe-45ba-a6a7-4f51402f3d38-kube-api-access-5kmdh\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.090679 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8446fa80-aebe-45ba-a6a7-4f51402f3d38-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.090719 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8446fa80-aebe-45ba-a6a7-4f51402f3d38-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.092198 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85494b87f-gljj2"] Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.094533 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8446fa80-aebe-45ba-a6a7-4f51402f3d38-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.095616 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8446fa80-aebe-45ba-a6a7-4f51402f3d38-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.109378 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmdh\" (UniqueName: \"kubernetes.io/projected/8446fa80-aebe-45ba-a6a7-4f51402f3d38-kube-api-access-5kmdh\") pod \"nova-cell1-conductor-0\" (UID: \"8446fa80-aebe-45ba-a6a7-4f51402f3d38\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.120927 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.120980 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.211856 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.394764 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.614729 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.703051 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-config-data\") pod \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.703124 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvpqh\" (UniqueName: \"kubernetes.io/projected/f6ff7ca2-9dc3-4b68-856d-19e508aded87-kube-api-access-vvpqh\") pod \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.703163 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-combined-ca-bundle\") pod \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.703193 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ff7ca2-9dc3-4b68-856d-19e508aded87-logs\") pod \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.703297 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-nova-metadata-tls-certs\") pod \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\" (UID: \"f6ff7ca2-9dc3-4b68-856d-19e508aded87\") " Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.704068 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f6ff7ca2-9dc3-4b68-856d-19e508aded87-logs" (OuterVolumeSpecName: "logs") pod "f6ff7ca2-9dc3-4b68-856d-19e508aded87" (UID: "f6ff7ca2-9dc3-4b68-856d-19e508aded87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.708533 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ff7ca2-9dc3-4b68-856d-19e508aded87-kube-api-access-vvpqh" (OuterVolumeSpecName: "kube-api-access-vvpqh") pod "f6ff7ca2-9dc3-4b68-856d-19e508aded87" (UID: "f6ff7ca2-9dc3-4b68-856d-19e508aded87"). InnerVolumeSpecName "kube-api-access-vvpqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.728402 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-config-data" (OuterVolumeSpecName: "config-data") pod "f6ff7ca2-9dc3-4b68-856d-19e508aded87" (UID: "f6ff7ca2-9dc3-4b68-856d-19e508aded87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.729863 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6ff7ca2-9dc3-4b68-856d-19e508aded87" (UID: "f6ff7ca2-9dc3-4b68-856d-19e508aded87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.766422 4959 generic.go:334] "Generic (PLEG): container finished" podID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerID="e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543" exitCode=143 Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.768529 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8dd1506f-f443-4edf-8a68-ae3f2228ebbc","Type":"ContainerDied","Data":"e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543"} Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772746 4959 generic.go:334] "Generic (PLEG): container finished" podID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerID="fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd" exitCode=0 Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772778 4959 generic.go:334] "Generic (PLEG): container finished" podID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerID="a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5" exitCode=143 Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772823 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6ff7ca2-9dc3-4b68-856d-19e508aded87","Type":"ContainerDied","Data":"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd"} Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772846 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6ff7ca2-9dc3-4b68-856d-19e508aded87","Type":"ContainerDied","Data":"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5"} Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772857 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f6ff7ca2-9dc3-4b68-856d-19e508aded87","Type":"ContainerDied","Data":"d6bf3035efc6ec8e471021c45abfcda899649d7a5bf0cd887cb1d15c3d7fa0b9"} Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772871 4959 scope.go:117] "RemoveContainer" containerID="fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772977 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.772978 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f6ff7ca2-9dc3-4b68-856d-19e508aded87" (UID: "f6ff7ca2-9dc3-4b68-856d-19e508aded87"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.779312 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:19:10 crc kubenswrapper[4959]: W1007 13:19:10.780755 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8446fa80_aebe_45ba_a6a7_4f51402f3d38.slice/crio-6bf1f557b50803ad27c22b42c04c5cb83982aa9155a36f611654489ace9d2067 WatchSource:0}: Error finding container 6bf1f557b50803ad27c22b42c04c5cb83982aa9155a36f611654489ace9d2067: Status 404 returned error can't find the container with id 6bf1f557b50803ad27c22b42c04c5cb83982aa9155a36f611654489ace9d2067 Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.817383 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.817418 4959 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvpqh\" (UniqueName: \"kubernetes.io/projected/f6ff7ca2-9dc3-4b68-856d-19e508aded87-kube-api-access-vvpqh\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.817432 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.817444 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ff7ca2-9dc3-4b68-856d-19e508aded87-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.817455 4959 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ff7ca2-9dc3-4b68-856d-19e508aded87-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.826726 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a05d93-0c05-45c0-872e-decc26f3fb0e" path="/var/lib/kubelet/pods/32a05d93-0c05-45c0-872e-decc26f3fb0e/volumes" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.827687 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.828911 4959 scope.go:117] "RemoveContainer" containerID="a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.847712 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.874695 4959 scope.go:117] "RemoveContainer" containerID="fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd" Oct 07 13:19:10 crc kubenswrapper[4959]: E1007 13:19:10.875914 4959 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd\": container with ID starting with fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd not found: ID does not exist" containerID="fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.875983 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd"} err="failed to get container status \"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd\": rpc error: code = NotFound desc = could not find container \"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd\": container with ID starting with fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd not found: ID does not exist" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.876012 4959 scope.go:117] "RemoveContainer" containerID="a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5" Oct 07 13:19:10 crc kubenswrapper[4959]: E1007 13:19:10.877269 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5\": container with ID starting with a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5 not found: ID does not exist" containerID="a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.877310 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5"} err="failed to get container status \"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5\": rpc error: code = NotFound 
desc = could not find container \"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5\": container with ID starting with a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5 not found: ID does not exist" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.877342 4959 scope.go:117] "RemoveContainer" containerID="fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.877683 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd"} err="failed to get container status \"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd\": rpc error: code = NotFound desc = could not find container \"fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd\": container with ID starting with fef8cfecb7eca90cc142cc572ac3c9d3cd6959cd0c030d9873b26128f67c44fd not found: ID does not exist" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.877708 4959 scope.go:117] "RemoveContainer" containerID="a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.883006 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5"} err="failed to get container status \"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5\": rpc error: code = NotFound desc = could not find container \"a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5\": container with ID starting with a07fea589f22dfc6b7ef3a5e83cc3b5fc0e25800a2b5b46210cba76dbaf8e0f5 not found: ID does not exist" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.883059 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:10 crc kubenswrapper[4959]: E1007 13:19:10.883486 4959 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-metadata" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.883507 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-metadata" Oct 07 13:19:10 crc kubenswrapper[4959]: E1007 13:19:10.883557 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-log" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.883566 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-log" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.883795 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-metadata" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.883828 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" containerName="nova-metadata-log" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.885675 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.890927 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.890989 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 13:19:10 crc kubenswrapper[4959]: I1007 13:19:10.896152 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.020461 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.020522 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-logs\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.020550 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.020810 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xth7\" (UniqueName: \"kubernetes.io/projected/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-kube-api-access-4xth7\") pod 
\"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.020834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-config-data\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.122248 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xth7\" (UniqueName: \"kubernetes.io/projected/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-kube-api-access-4xth7\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.122297 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-config-data\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.122322 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.122360 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-logs\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.122384 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.123604 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-logs\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.126355 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.126502 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-config-data\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.126721 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.146649 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xth7\" (UniqueName: \"kubernetes.io/projected/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-kube-api-access-4xth7\") pod \"nova-metadata-0\" 
(UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") " pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.212751 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.659149 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.793137 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b","Type":"ContainerStarted","Data":"3615f8a9841d9c04333f7c02a933c97033d23b51c1961487bd386c070b259768"} Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.796254 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8446fa80-aebe-45ba-a6a7-4f51402f3d38","Type":"ContainerStarted","Data":"58f9084c810528eaa528a1f92dbd8f95f9b9a49275bca68f13e7972311cfe889"} Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.796295 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8446fa80-aebe-45ba-a6a7-4f51402f3d38","Type":"ContainerStarted","Data":"6bf1f557b50803ad27c22b42c04c5cb83982aa9155a36f611654489ace9d2067"} Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.796395 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" containerName="nova-scheduler-scheduler" containerID="cri-o://bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d" gracePeriod=30 Oct 07 13:19:11 crc kubenswrapper[4959]: I1007 13:19:11.819312 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.819294736 podStartE2EDuration="2.819294736s" podCreationTimestamp="2025-10-07 13:19:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:11.819149291 +0000 UTC m=+1103.979871978" watchObservedRunningTime="2025-10-07 13:19:11.819294736 +0000 UTC m=+1103.980017413" Oct 07 13:19:12 crc kubenswrapper[4959]: I1007 13:19:12.841678 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ff7ca2-9dc3-4b68-856d-19e508aded87" path="/var/lib/kubelet/pods/f6ff7ca2-9dc3-4b68-856d-19e508aded87/volumes" Oct 07 13:19:12 crc kubenswrapper[4959]: I1007 13:19:12.843351 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.84330031 podStartE2EDuration="2.84330031s" podCreationTimestamp="2025-10-07 13:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:12.84260266 +0000 UTC m=+1105.003325357" watchObservedRunningTime="2025-10-07 13:19:12.84330031 +0000 UTC m=+1105.004022987" Oct 07 13:19:12 crc kubenswrapper[4959]: I1007 13:19:12.843815 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b","Type":"ContainerStarted","Data":"e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b"} Oct 07 13:19:12 crc kubenswrapper[4959]: I1007 13:19:12.843895 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b","Type":"ContainerStarted","Data":"451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63"} Oct 07 13:19:12 crc kubenswrapper[4959]: I1007 13:19:12.843953 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:13 crc kubenswrapper[4959]: I1007 13:19:13.832044 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerStarted","Data":"d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f"} Oct 07 13:19:14 crc kubenswrapper[4959]: E1007 13:19:14.100218 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:19:14 crc kubenswrapper[4959]: E1007 13:19:14.102279 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:19:14 crc kubenswrapper[4959]: E1007 13:19:14.103788 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:19:14 crc kubenswrapper[4959]: E1007 13:19:14.104125 4959 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" containerName="nova-scheduler-scheduler" Oct 07 13:19:14 crc kubenswrapper[4959]: I1007 13:19:14.862609 4959 generic.go:334] "Generic (PLEG): container finished" podID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" containerID="bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d" exitCode=0 Oct 07 13:19:14 crc 
kubenswrapper[4959]: I1007 13:19:14.862899 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54","Type":"ContainerDied","Data":"bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d"} Oct 07 13:19:14 crc kubenswrapper[4959]: I1007 13:19:14.864795 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerStarted","Data":"33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484"} Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.081918 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.191589 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-combined-ca-bundle\") pod \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.191752 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7khb\" (UniqueName: \"kubernetes.io/projected/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-kube-api-access-v7khb\") pod \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.191893 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-config-data\") pod \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\" (UID: \"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.197098 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-kube-api-access-v7khb" (OuterVolumeSpecName: "kube-api-access-v7khb") pod "3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" (UID: "3f82f728-0fae-4cfd-8d4b-79c04e3b8b54"). InnerVolumeSpecName "kube-api-access-v7khb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.217824 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-config-data" (OuterVolumeSpecName: "config-data") pod "3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" (UID: "3f82f728-0fae-4cfd-8d4b-79c04e3b8b54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.238305 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.252183 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" (UID: "3f82f728-0fae-4cfd-8d4b-79c04e3b8b54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.294423 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.294465 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.294482 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7khb\" (UniqueName: \"kubernetes.io/projected/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54-kube-api-access-v7khb\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.773546 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.906238 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-combined-ca-bundle\") pod \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.906825 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-logs\") pod \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.906910 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9skg2\" (UniqueName: 
\"kubernetes.io/projected/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-kube-api-access-9skg2\") pod \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.906977 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-config-data\") pod \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\" (UID: \"8dd1506f-f443-4edf-8a68-ae3f2228ebbc\") " Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.916980 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerStarted","Data":"961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3"} Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.917167 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-logs" (OuterVolumeSpecName: "logs") pod "8dd1506f-f443-4edf-8a68-ae3f2228ebbc" (UID: "8dd1506f-f443-4edf-8a68-ae3f2228ebbc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.928949 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-kube-api-access-9skg2" (OuterVolumeSpecName: "kube-api-access-9skg2") pod "8dd1506f-f443-4edf-8a68-ae3f2228ebbc" (UID: "8dd1506f-f443-4edf-8a68-ae3f2228ebbc"). InnerVolumeSpecName "kube-api-access-9skg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.941445 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3f82f728-0fae-4cfd-8d4b-79c04e3b8b54","Type":"ContainerDied","Data":"89f04fa55561803593093d5cf32a43d5c95a7615505aa24f424e37b0b9c38dca"} Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.941503 4959 scope.go:117] "RemoveContainer" containerID="bd78e86e2e15e511646dd1c86d6d2d77438e78d49f7e2cd3921031b9b3c76c3d" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.941698 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.967588 4959 generic.go:334] "Generic (PLEG): container finished" podID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerID="f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe" exitCode=0 Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.967654 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8dd1506f-f443-4edf-8a68-ae3f2228ebbc","Type":"ContainerDied","Data":"f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe"} Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.967680 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8dd1506f-f443-4edf-8a68-ae3f2228ebbc","Type":"ContainerDied","Data":"947bb3ae400c420b1a2006ba50b0ed92ae495c0cdc637e7e0adabf68ff803e60"} Oct 07 13:19:15 crc kubenswrapper[4959]: I1007 13:19:15.967755 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.009924 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.009956 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9skg2\" (UniqueName: \"kubernetes.io/projected/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-kube-api-access-9skg2\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.026542 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.039128 4959 scope.go:117] "RemoveContainer" containerID="f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.041472 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-config-data" (OuterVolumeSpecName: "config-data") pod "8dd1506f-f443-4edf-8a68-ae3f2228ebbc" (UID: "8dd1506f-f443-4edf-8a68-ae3f2228ebbc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.043684 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.055683 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: E1007 13:19:16.056119 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" containerName="nova-scheduler-scheduler" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.056135 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" containerName="nova-scheduler-scheduler" Oct 07 13:19:16 crc kubenswrapper[4959]: E1007 13:19:16.056154 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-api" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.056160 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-api" Oct 07 13:19:16 crc kubenswrapper[4959]: E1007 13:19:16.056172 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-log" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.056178 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-log" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.056344 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-api" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.056354 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" containerName="nova-api-log" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 
13:19:16.056370 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" containerName="nova-scheduler-scheduler" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.057024 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.061103 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.066887 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.072402 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd1506f-f443-4edf-8a68-ae3f2228ebbc" (UID: "8dd1506f-f443-4edf-8a68-ae3f2228ebbc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.072981 4959 scope.go:117] "RemoveContainer" containerID="e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.090162 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.111341 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.111399 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zwz\" (UniqueName: \"kubernetes.io/projected/e3271361-342b-4173-a261-884f63259112-kube-api-access-s5zwz\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.111452 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-config-data\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.111505 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.111516 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dd1506f-f443-4edf-8a68-ae3f2228ebbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.132298 4959 scope.go:117] "RemoveContainer" containerID="f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe" Oct 07 13:19:16 crc kubenswrapper[4959]: E1007 13:19:16.133028 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe\": container with ID starting with f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe not found: ID does not exist" containerID="f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.133085 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe"} err="failed to get container status \"f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe\": rpc error: code = NotFound desc = could not find container \"f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe\": container with ID starting with f7d05c24c3823337f6b4e0a17d8427ddbf1c03e5a82ea445ef100cd65e126efe not found: ID does not exist" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.133120 4959 scope.go:117] "RemoveContainer" containerID="e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543" Oct 07 13:19:16 crc kubenswrapper[4959]: E1007 13:19:16.133588 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543\": container with ID starting with e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543 not found: ID does not exist" containerID="e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543" Oct 
07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.133652 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543"} err="failed to get container status \"e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543\": rpc error: code = NotFound desc = could not find container \"e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543\": container with ID starting with e29627e377634ddd0bc2f50df3c4f2f65d751b47156293b405d7eae4e756b543 not found: ID does not exist" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.212815 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.213269 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.213527 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-config-data\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.213770 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.213837 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zwz\" (UniqueName: \"kubernetes.io/projected/e3271361-342b-4173-a261-884f63259112-kube-api-access-s5zwz\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " 
pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.224795 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-config-data\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.224978 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.229213 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zwz\" (UniqueName: \"kubernetes.io/projected/e3271361-342b-4173-a261-884f63259112-kube-api-access-s5zwz\") pod \"nova-scheduler-0\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.317779 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.336702 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.344456 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.346099 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.351496 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.361207 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.373467 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.417892 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763a8b33-d767-44c1-9f77-9f0c97b88490-logs\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.418254 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lccwr\" (UniqueName: \"kubernetes.io/projected/763a8b33-d767-44c1-9f77-9f0c97b88490-kube-api-access-lccwr\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.418330 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-config-data\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.418353 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " 
pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.519951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lccwr\" (UniqueName: \"kubernetes.io/projected/763a8b33-d767-44c1-9f77-9f0c97b88490-kube-api-access-lccwr\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.520049 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-config-data\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.520077 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.520159 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763a8b33-d767-44c1-9f77-9f0c97b88490-logs\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.520672 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763a8b33-d767-44c1-9f77-9f0c97b88490-logs\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.526490 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.551807 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-config-data\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.554227 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lccwr\" (UniqueName: \"kubernetes.io/projected/763a8b33-d767-44c1-9f77-9f0c97b88490-kube-api-access-lccwr\") pod \"nova-api-0\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") " pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.670998 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:19:16 crc kubenswrapper[4959]: W1007 13:19:16.821234 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3271361_342b_4173_a261_884f63259112.slice/crio-311f8dd52c688a1a6577478d6529e5bd72588dc330750b07dbbd4ce8cdcbf617 WatchSource:0}: Error finding container 311f8dd52c688a1a6577478d6529e5bd72588dc330750b07dbbd4ce8cdcbf617: Status 404 returned error can't find the container with id 311f8dd52c688a1a6577478d6529e5bd72588dc330750b07dbbd4ce8cdcbf617 Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.824359 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f82f728-0fae-4cfd-8d4b-79c04e3b8b54" path="/var/lib/kubelet/pods/3f82f728-0fae-4cfd-8d4b-79c04e3b8b54/volumes" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.825109 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8dd1506f-f443-4edf-8a68-ae3f2228ebbc" path="/var/lib/kubelet/pods/8dd1506f-f443-4edf-8a68-ae3f2228ebbc/volumes" Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.825819 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.975908 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3271361-342b-4173-a261-884f63259112","Type":"ContainerStarted","Data":"311f8dd52c688a1a6577478d6529e5bd72588dc330750b07dbbd4ce8cdcbf617"} Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.982045 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerStarted","Data":"2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8"} Oct 07 13:19:16 crc kubenswrapper[4959]: I1007 13:19:16.982095 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 13:19:17 crc kubenswrapper[4959]: I1007 13:19:17.003912 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.82439332 podStartE2EDuration="10.003891427s" podCreationTimestamp="2025-10-07 13:19:07 +0000 UTC" firstStartedPulling="2025-10-07 13:19:08.582159902 +0000 UTC m=+1100.742882579" lastFinishedPulling="2025-10-07 13:19:16.761658009 +0000 UTC m=+1108.922380686" observedRunningTime="2025-10-07 13:19:16.998742546 +0000 UTC m=+1109.159465223" watchObservedRunningTime="2025-10-07 13:19:17.003891427 +0000 UTC m=+1109.164614094" Oct 07 13:19:17 crc kubenswrapper[4959]: I1007 13:19:17.094350 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:17 crc kubenswrapper[4959]: W1007 13:19:17.094648 4959 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod763a8b33_d767_44c1_9f77_9f0c97b88490.slice/crio-fbaa74eba94e013bb0d9ece21b56465a24d4de883f8b46d73d92749363a80c91 WatchSource:0}: Error finding container fbaa74eba94e013bb0d9ece21b56465a24d4de883f8b46d73d92749363a80c91: Status 404 returned error can't find the container with id fbaa74eba94e013bb0d9ece21b56465a24d4de883f8b46d73d92749363a80c91 Oct 07 13:19:17 crc kubenswrapper[4959]: I1007 13:19:17.990988 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763a8b33-d767-44c1-9f77-9f0c97b88490","Type":"ContainerStarted","Data":"b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67"} Oct 07 13:19:17 crc kubenswrapper[4959]: I1007 13:19:17.991317 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763a8b33-d767-44c1-9f77-9f0c97b88490","Type":"ContainerStarted","Data":"d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354"} Oct 07 13:19:17 crc kubenswrapper[4959]: I1007 13:19:17.991329 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763a8b33-d767-44c1-9f77-9f0c97b88490","Type":"ContainerStarted","Data":"fbaa74eba94e013bb0d9ece21b56465a24d4de883f8b46d73d92749363a80c91"} Oct 07 13:19:17 crc kubenswrapper[4959]: I1007 13:19:17.993658 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3271361-342b-4173-a261-884f63259112","Type":"ContainerStarted","Data":"61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25"} Oct 07 13:19:18 crc kubenswrapper[4959]: I1007 13:19:18.019009 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.01898502 podStartE2EDuration="2.01898502s" podCreationTimestamp="2025-10-07 13:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 13:19:18.01148451 +0000 UTC m=+1110.172207227" watchObservedRunningTime="2025-10-07 13:19:18.01898502 +0000 UTC m=+1110.179707697" Oct 07 13:19:18 crc kubenswrapper[4959]: I1007 13:19:18.037927 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.037904563 podStartE2EDuration="2.037904563s" podCreationTimestamp="2025-10-07 13:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:18.030202788 +0000 UTC m=+1110.190925475" watchObservedRunningTime="2025-10-07 13:19:18.037904563 +0000 UTC m=+1110.198627240" Oct 07 13:19:21 crc kubenswrapper[4959]: I1007 13:19:21.213423 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:19:21 crc kubenswrapper[4959]: I1007 13:19:21.213712 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:19:21 crc kubenswrapper[4959]: I1007 13:19:21.373664 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 13:19:22 crc kubenswrapper[4959]: I1007 13:19:22.224876 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 13:19:22 crc kubenswrapper[4959]: I1007 13:19:22.224945 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 13:19:26 crc 
kubenswrapper[4959]: I1007 13:19:26.374608 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 13:19:26 crc kubenswrapper[4959]: I1007 13:19:26.404045 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 13:19:26 crc kubenswrapper[4959]: I1007 13:19:26.671747 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:19:26 crc kubenswrapper[4959]: I1007 13:19:26.671801 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:19:27 crc kubenswrapper[4959]: I1007 13:19:27.092315 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 13:19:27 crc kubenswrapper[4959]: I1007 13:19:27.753822 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:19:27 crc kubenswrapper[4959]: I1007 13:19:27.753822 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:19:31 crc kubenswrapper[4959]: I1007 13:19:31.219307 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 13:19:31 crc kubenswrapper[4959]: I1007 13:19:31.223125 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 13:19:31 crc kubenswrapper[4959]: I1007 13:19:31.235510 4959 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 13:19:32 crc kubenswrapper[4959]: I1007 13:19:32.132697 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.087085 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.141606 4959 generic.go:334] "Generic (PLEG): container finished" podID="768644fb-d5ea-43ba-8277-7864670945ec" containerID="d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800" exitCode=137 Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.141697 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"768644fb-d5ea-43ba-8277-7864670945ec","Type":"ContainerDied","Data":"d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800"} Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.141667 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.141755 4959 scope.go:117] "RemoveContainer" containerID="d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.141740 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"768644fb-d5ea-43ba-8277-7864670945ec","Type":"ContainerDied","Data":"f08bcbaed10a72aa14e335c9c63f7650e5afbecb33f26ecbfe9802a4737bb8ea"} Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.160814 4959 scope.go:117] "RemoveContainer" containerID="d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800" Oct 07 13:19:34 crc kubenswrapper[4959]: E1007 13:19:34.161138 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800\": container with ID starting with d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800 not found: ID does not exist" containerID="d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.161171 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800"} err="failed to get container status \"d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800\": rpc error: code = NotFound desc = could not find container \"d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800\": container with ID starting with d58a8b83cf70cb0d8a1cdc3d56590d03ec993d939bdac4960c92b29705623800 not found: ID does not exist" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.169882 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-combined-ca-bundle\") pod \"768644fb-d5ea-43ba-8277-7864670945ec\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.170032 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/768644fb-d5ea-43ba-8277-7864670945ec-kube-api-access-qnkjf\") pod \"768644fb-d5ea-43ba-8277-7864670945ec\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.170198 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-config-data\") pod \"768644fb-d5ea-43ba-8277-7864670945ec\" (UID: \"768644fb-d5ea-43ba-8277-7864670945ec\") " Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.184924 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768644fb-d5ea-43ba-8277-7864670945ec-kube-api-access-qnkjf" (OuterVolumeSpecName: "kube-api-access-qnkjf") pod "768644fb-d5ea-43ba-8277-7864670945ec" (UID: "768644fb-d5ea-43ba-8277-7864670945ec"). InnerVolumeSpecName "kube-api-access-qnkjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.194754 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "768644fb-d5ea-43ba-8277-7864670945ec" (UID: "768644fb-d5ea-43ba-8277-7864670945ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.196825 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-config-data" (OuterVolumeSpecName: "config-data") pod "768644fb-d5ea-43ba-8277-7864670945ec" (UID: "768644fb-d5ea-43ba-8277-7864670945ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.272662 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/768644fb-d5ea-43ba-8277-7864670945ec-kube-api-access-qnkjf\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.272695 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.272706 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768644fb-d5ea-43ba-8277-7864670945ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.474447 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.488474 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.526324 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:19:34 crc kubenswrapper[4959]: E1007 13:19:34.526886 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768644fb-d5ea-43ba-8277-7864670945ec" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 
13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.526907 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="768644fb-d5ea-43ba-8277-7864670945ec" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.527137 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="768644fb-d5ea-43ba-8277-7864670945ec" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.527931 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.530863 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.531126 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.537981 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.584097 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99bg\" (UniqueName: \"kubernetes.io/projected/9addbd40-1800-4967-bb06-7a90697034dd-kube-api-access-x99bg\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.584249 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: 
I1007 13:19:34.584290 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.584330 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.584388 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.620762 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.687486 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99bg\" (UniqueName: \"kubernetes.io/projected/9addbd40-1800-4967-bb06-7a90697034dd-kube-api-access-x99bg\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.687610 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.687657 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.687715 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.687758 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.693096 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.693892 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.693995 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.706397 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99bg\" (UniqueName: \"kubernetes.io/projected/9addbd40-1800-4967-bb06-7a90697034dd-kube-api-access-x99bg\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.711756 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9addbd40-1800-4967-bb06-7a90697034dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9addbd40-1800-4967-bb06-7a90697034dd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.818414 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768644fb-d5ea-43ba-8277-7864670945ec" path="/var/lib/kubelet/pods/768644fb-d5ea-43ba-8277-7864670945ec/volumes" Oct 07 13:19:34 crc kubenswrapper[4959]: I1007 13:19:34.854192 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:35 crc kubenswrapper[4959]: I1007 13:19:35.330370 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:19:35 crc kubenswrapper[4959]: W1007 13:19:35.336887 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9addbd40_1800_4967_bb06_7a90697034dd.slice/crio-8b0bee316b2bbbb1ade597605e1f15ad0892cc8f8a474a9640777215c0cddde3 WatchSource:0}: Error finding container 8b0bee316b2bbbb1ade597605e1f15ad0892cc8f8a474a9640777215c0cddde3: Status 404 returned error can't find the container with id 8b0bee316b2bbbb1ade597605e1f15ad0892cc8f8a474a9640777215c0cddde3 Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 13:19:36.161208 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9addbd40-1800-4967-bb06-7a90697034dd","Type":"ContainerStarted","Data":"47cb476bb4946bb27377947d5814c46402aeed3b9f8a50bf2a289312b6f83738"} Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 13:19:36.161516 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9addbd40-1800-4967-bb06-7a90697034dd","Type":"ContainerStarted","Data":"8b0bee316b2bbbb1ade597605e1f15ad0892cc8f8a474a9640777215c0cddde3"} Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 13:19:36.195172 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.195150395 podStartE2EDuration="2.195150395s" podCreationTimestamp="2025-10-07 13:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:36.183955068 +0000 UTC m=+1128.344677745" watchObservedRunningTime="2025-10-07 13:19:36.195150395 +0000 UTC m=+1128.355873072" Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 
13:19:36.674916 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 13:19:36.675815 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 13:19:36.676325 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 13:19:36 crc kubenswrapper[4959]: I1007 13:19:36.683880 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.172990 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.176742 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.348220 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869677f947-z6bhx"] Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.353226 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.362783 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869677f947-z6bhx"] Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.438421 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hwx\" (UniqueName: \"kubernetes.io/projected/25aebe9e-8937-40d0-bb85-e057e6b79778-kube-api-access-z6hwx\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.439255 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-nb\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.439446 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-dns-svc\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.439616 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-sb\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.439765 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-config\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.541743 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hwx\" (UniqueName: \"kubernetes.io/projected/25aebe9e-8937-40d0-bb85-e057e6b79778-kube-api-access-z6hwx\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.542073 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-nb\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.542216 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-dns-svc\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.542333 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-sb\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.542436 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-config\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.543001 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-nb\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.543569 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-dns-svc\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.543813 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-config\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.544113 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-sb\") pod \"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.562785 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hwx\" (UniqueName: \"kubernetes.io/projected/25aebe9e-8937-40d0-bb85-e057e6b79778-kube-api-access-z6hwx\") pod 
\"dnsmasq-dns-869677f947-z6bhx\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.696091 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.696151 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:19:37 crc kubenswrapper[4959]: I1007 13:19:37.709351 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:38 crc kubenswrapper[4959]: I1007 13:19:38.204166 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869677f947-z6bhx"] Oct 07 13:19:38 crc kubenswrapper[4959]: I1007 13:19:38.249736 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.190049 4959 generic.go:334] "Generic (PLEG): container finished" podID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerID="97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf" exitCode=0 Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.192051 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869677f947-z6bhx" event={"ID":"25aebe9e-8937-40d0-bb85-e057e6b79778","Type":"ContainerDied","Data":"97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf"} Oct 07 13:19:39 crc kubenswrapper[4959]: 
I1007 13:19:39.192083 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869677f947-z6bhx" event={"ID":"25aebe9e-8937-40d0-bb85-e057e6b79778","Type":"ContainerStarted","Data":"b562473c1dbec9c2a8cf9073e41a7ff2d61be84b2a4a98c67505bc3734777197"} Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.597552 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.598882 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-central-agent" containerID="cri-o://d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f" gracePeriod=30 Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.598923 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="proxy-httpd" containerID="cri-o://2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8" gracePeriod=30 Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.598966 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-notification-agent" containerID="cri-o://33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484" gracePeriod=30 Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.598953 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="sg-core" containerID="cri-o://961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3" gracePeriod=30 Oct 07 13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.854658 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 
13:19:39 crc kubenswrapper[4959]: I1007 13:19:39.963957 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.200135 4959 generic.go:334] "Generic (PLEG): container finished" podID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerID="2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8" exitCode=0 Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.200174 4959 generic.go:334] "Generic (PLEG): container finished" podID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerID="961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3" exitCode=2 Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.200182 4959 generic.go:334] "Generic (PLEG): container finished" podID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerID="d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f" exitCode=0 Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.200222 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerDied","Data":"2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8"} Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.200249 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerDied","Data":"961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3"} Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.200259 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerDied","Data":"d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f"} Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.201954 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-log" containerID="cri-o://d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354" gracePeriod=30 Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.202544 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869677f947-z6bhx" event={"ID":"25aebe9e-8937-40d0-bb85-e057e6b79778","Type":"ContainerStarted","Data":"26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553"} Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.202814 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-api" containerID="cri-o://b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67" gracePeriod=30 Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.202951 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:40 crc kubenswrapper[4959]: I1007 13:19:40.232217 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869677f947-z6bhx" podStartSLOduration=3.232196846 podStartE2EDuration="3.232196846s" podCreationTimestamp="2025-10-07 13:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:40.226926302 +0000 UTC m=+1132.387648989" watchObservedRunningTime="2025-10-07 13:19:40.232196846 +0000 UTC m=+1132.392919533" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.153500 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210508 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-log-httpd\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210603 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-ceilometer-tls-certs\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210621 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-run-httpd\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210760 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-sg-core-conf-yaml\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210789 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfg4j\" (UniqueName: \"kubernetes.io/projected/6a44aa4c-74c8-4334-b63e-1f845950d1c1-kube-api-access-zfg4j\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210809 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-scripts\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210871 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-config-data\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.210887 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-combined-ca-bundle\") pod \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\" (UID: \"6a44aa4c-74c8-4334-b63e-1f845950d1c1\") " Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.211293 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.211652 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.221419 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a44aa4c-74c8-4334-b63e-1f845950d1c1-kube-api-access-zfg4j" (OuterVolumeSpecName: "kube-api-access-zfg4j") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "kube-api-access-zfg4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.226909 4959 generic.go:334] "Generic (PLEG): container finished" podID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerID="d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354" exitCode=143 Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.227011 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763a8b33-d767-44c1-9f77-9f0c97b88490","Type":"ContainerDied","Data":"d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354"} Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.231909 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-scripts" (OuterVolumeSpecName: "scripts") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.234657 4959 generic.go:334] "Generic (PLEG): container finished" podID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerID="33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484" exitCode=0 Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.234742 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerDied","Data":"33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484"} Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.234776 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a44aa4c-74c8-4334-b63e-1f845950d1c1","Type":"ContainerDied","Data":"649e148dd93ec2aefcaffb40faf03c46bc4347968e0f9ebd099b0465e39ec34d"} Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.234798 4959 scope.go:117] "RemoveContainer" containerID="2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.234725 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.256816 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.274260 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.314087 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.314121 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a44aa4c-74c8-4334-b63e-1f845950d1c1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.314133 4959 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.314147 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.314158 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfg4j\" (UniqueName: \"kubernetes.io/projected/6a44aa4c-74c8-4334-b63e-1f845950d1c1-kube-api-access-zfg4j\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.314169 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.330000 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.368496 4959 scope.go:117] "RemoveContainer" containerID="961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.380118 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-config-data" (OuterVolumeSpecName: "config-data") pod "6a44aa4c-74c8-4334-b63e-1f845950d1c1" (UID: "6a44aa4c-74c8-4334-b63e-1f845950d1c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.388595 4959 scope.go:117] "RemoveContainer" containerID="33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.416098 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.416136 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a44aa4c-74c8-4334-b63e-1f845950d1c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.427154 4959 scope.go:117] "RemoveContainer" containerID="d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.456523 4959 scope.go:117] "RemoveContainer" containerID="2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8" Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.457053 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8\": container with ID starting with 2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8 not found: ID does not exist" containerID="2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.457089 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8"} err="failed to get container status \"2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8\": rpc error: code = NotFound desc = could not find container 
\"2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8\": container with ID starting with 2f503cf493611d47659bc7d22ae982160a170a248f494186d8e9bdf4b12783b8 not found: ID does not exist" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.457115 4959 scope.go:117] "RemoveContainer" containerID="961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3" Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.457431 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3\": container with ID starting with 961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3 not found: ID does not exist" containerID="961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.457450 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3"} err="failed to get container status \"961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3\": rpc error: code = NotFound desc = could not find container \"961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3\": container with ID starting with 961923ea342d9f95b240ea4ba63bd0b933daf3e283e047ad12f38d528b5bd5a3 not found: ID does not exist" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.457461 4959 scope.go:117] "RemoveContainer" containerID="33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484" Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.457905 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484\": container with ID starting with 33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484 not found: ID does not exist" 
containerID="33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.457934 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484"} err="failed to get container status \"33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484\": rpc error: code = NotFound desc = could not find container \"33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484\": container with ID starting with 33449e0ae5fa7bcc74530c31e4f695d4b718f4d422cdd79f4b5a83df5f537484 not found: ID does not exist" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.457947 4959 scope.go:117] "RemoveContainer" containerID="d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f" Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.458363 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f\": container with ID starting with d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f not found: ID does not exist" containerID="d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.458380 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f"} err="failed to get container status \"d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f\": rpc error: code = NotFound desc = could not find container \"d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f\": container with ID starting with d63d601d901c133886eb63361704de8e3a133cc75016e81d7fc317ed50f8d65f not found: ID does not exist" Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.565058 4959 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.573131 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.594496 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.594919 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="sg-core"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.594940 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="sg-core"
Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.594951 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-notification-agent"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.594957 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-notification-agent"
Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.594980 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-central-agent"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.594986 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-central-agent"
Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.595001 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="proxy-httpd"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.595007 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="proxy-httpd"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.595164 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-notification-agent"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.595177 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="ceilometer-central-agent"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.595189 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="proxy-httpd"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.595195 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" containerName="sg-core"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.596851 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.599120 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.602457 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.604185 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.607270 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.631242 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-config-data\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.635066 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.635400 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.635587 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-log-httpd\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.635702 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-run-httpd\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.635938 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvqv\" (UniqueName: \"kubernetes.io/projected/37f2af7b-e102-44d0-904d-b87dade4aad0-kube-api-access-zwvqv\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.636044 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.636155 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-scripts\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.653134 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:41 crc kubenswrapper[4959]: E1007 13:19:41.656110 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-zwvqv log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="37f2af7b-e102-44d0-904d-b87dade4aad0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.737507 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-scripts\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.737570 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-config-data\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.737603 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.738102 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.738159 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-log-httpd\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.738183 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-run-httpd\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.738249 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvqv\" (UniqueName: \"kubernetes.io/projected/37f2af7b-e102-44d0-904d-b87dade4aad0-kube-api-access-zwvqv\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.738269 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.738753 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-log-httpd\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.739559 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-run-httpd\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.742157 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.742196 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.742440 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-scripts\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.742598 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.743328 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-config-data\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:41 crc kubenswrapper[4959]: I1007 13:19:41.756783 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwvqv\" (UniqueName: \"kubernetes.io/projected/37f2af7b-e102-44d0-904d-b87dade4aad0-kube-api-access-zwvqv\") pod \"ceilometer-0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") " pod="openstack/ceilometer-0"
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.243298 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.252350 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349128 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwvqv\" (UniqueName: \"kubernetes.io/projected/37f2af7b-e102-44d0-904d-b87dade4aad0-kube-api-access-zwvqv\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349241 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-combined-ca-bundle\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349296 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-config-data\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349319 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-scripts\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349872 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-ceilometer-tls-certs\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349897 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-sg-core-conf-yaml\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.349981 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-log-httpd\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.350004 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-run-httpd\") pod \"37f2af7b-e102-44d0-904d-b87dade4aad0\" (UID: \"37f2af7b-e102-44d0-904d-b87dade4aad0\") "
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.350675 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.350858 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.356527 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.356585 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.356608 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-scripts" (OuterVolumeSpecName: "scripts") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.356678 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.361243 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f2af7b-e102-44d0-904d-b87dade4aad0-kube-api-access-zwvqv" (OuterVolumeSpecName: "kube-api-access-zwvqv") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "kube-api-access-zwvqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.367846 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-config-data" (OuterVolumeSpecName: "config-data") pod "37f2af7b-e102-44d0-904d-b87dade4aad0" (UID: "37f2af7b-e102-44d0-904d-b87dade4aad0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452615 4959 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452940 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452949 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452959 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37f2af7b-e102-44d0-904d-b87dade4aad0-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452967 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwvqv\" (UniqueName: \"kubernetes.io/projected/37f2af7b-e102-44d0-904d-b87dade4aad0-kube-api-access-zwvqv\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452983 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452992 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.452999 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f2af7b-e102-44d0-904d-b87dade4aad0-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:42 crc kubenswrapper[4959]: I1007 13:19:42.821017 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a44aa4c-74c8-4334-b63e-1f845950d1c1" path="/var/lib/kubelet/pods/6a44aa4c-74c8-4334-b63e-1f845950d1c1/volumes"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.249774 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.293033 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.301884 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.318904 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.321183 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.328605 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.328952 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.328967 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.341289 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370012 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-scripts\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370058 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370089 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-config-data\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370114 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-run-httpd\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370308 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wz5\" (UniqueName: \"kubernetes.io/projected/e5e1f006-c462-46be-b191-aab45dd3d1b7-kube-api-access-z7wz5\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370370 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.370582 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-log-httpd\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 
13:19:43.370738 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472083 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-scripts\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472129 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472160 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-config-data\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472187 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-run-httpd\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472221 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wz5\" (UniqueName: \"kubernetes.io/projected/e5e1f006-c462-46be-b191-aab45dd3d1b7-kube-api-access-z7wz5\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472239 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472319 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-log-httpd\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472385 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.472745 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-run-httpd\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.473136 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-log-httpd\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.476031 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.476446 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.476901 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-scripts\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.478493 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-config-data\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.478790 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.490332 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wz5\" (UniqueName: \"kubernetes.io/projected/e5e1f006-c462-46be-b191-aab45dd3d1b7-kube-api-access-z7wz5\") pod \"ceilometer-0\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.652232 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.832445 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.881226 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-config-data\") pod \"763a8b33-d767-44c1-9f77-9f0c97b88490\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") "
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.881348 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763a8b33-d767-44c1-9f77-9f0c97b88490-logs\") pod \"763a8b33-d767-44c1-9f77-9f0c97b88490\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") "
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.881400 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-combined-ca-bundle\") pod \"763a8b33-d767-44c1-9f77-9f0c97b88490\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") "
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.881584 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lccwr\" (UniqueName: \"kubernetes.io/projected/763a8b33-d767-44c1-9f77-9f0c97b88490-kube-api-access-lccwr\") pod \"763a8b33-d767-44c1-9f77-9f0c97b88490\" (UID: \"763a8b33-d767-44c1-9f77-9f0c97b88490\") "
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.887052 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763a8b33-d767-44c1-9f77-9f0c97b88490-logs" (OuterVolumeSpecName: "logs") pod "763a8b33-d767-44c1-9f77-9f0c97b88490" (UID: "763a8b33-d767-44c1-9f77-9f0c97b88490"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.896550 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763a8b33-d767-44c1-9f77-9f0c97b88490-kube-api-access-lccwr" (OuterVolumeSpecName: "kube-api-access-lccwr") pod "763a8b33-d767-44c1-9f77-9f0c97b88490" (UID: "763a8b33-d767-44c1-9f77-9f0c97b88490"). InnerVolumeSpecName "kube-api-access-lccwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.931729 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "763a8b33-d767-44c1-9f77-9f0c97b88490" (UID: "763a8b33-d767-44c1-9f77-9f0c97b88490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.940857 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-config-data" (OuterVolumeSpecName: "config-data") pod "763a8b33-d767-44c1-9f77-9f0c97b88490" (UID: "763a8b33-d767-44c1-9f77-9f0c97b88490"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.989769 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lccwr\" (UniqueName: \"kubernetes.io/projected/763a8b33-d767-44c1-9f77-9f0c97b88490-kube-api-access-lccwr\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.989804 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.989816 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763a8b33-d767-44c1-9f77-9f0c97b88490-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:43 crc kubenswrapper[4959]: I1007 13:19:43.989824 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763a8b33-d767-44c1-9f77-9f0c97b88490-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.199798 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:19:44 crc kubenswrapper[4959]: W1007 13:19:44.204451 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5e1f006_c462_46be_b191_aab45dd3d1b7.slice/crio-0b7da3a18f1781d138dcb60e6b86254b90d558a04debb085d77979fa5bfe162f WatchSource:0}: Error finding container 0b7da3a18f1781d138dcb60e6b86254b90d558a04debb085d77979fa5bfe162f: Status 404 returned error can't find the container with id 0b7da3a18f1781d138dcb60e6b86254b90d558a04debb085d77979fa5bfe162f
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.260526 4959 generic.go:334] "Generic (PLEG): container finished" podID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerID="b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67" exitCode=0
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.260597 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.260609 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763a8b33-d767-44c1-9f77-9f0c97b88490","Type":"ContainerDied","Data":"b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67"}
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.260749 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763a8b33-d767-44c1-9f77-9f0c97b88490","Type":"ContainerDied","Data":"fbaa74eba94e013bb0d9ece21b56465a24d4de883f8b46d73d92749363a80c91"}
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.260784 4959 scope.go:117] "RemoveContainer" containerID="b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67"
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.262292 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerStarted","Data":"0b7da3a18f1781d138dcb60e6b86254b90d558a04debb085d77979fa5bfe162f"}
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.286226 4959 scope.go:117] "RemoveContainer" containerID="d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354"
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.320846 4959 scope.go:117] "RemoveContainer" containerID="b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67"
Oct 07 13:19:44 crc kubenswrapper[4959]: E1007 13:19:44.321250 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67\": container with ID starting with b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67 not found: ID does not exist" containerID="b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67"
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.321284 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67"} err="failed to get container status \"b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67\": rpc error: code = NotFound desc = could not find container \"b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67\": container with ID starting with b505043180478dae205ff5fd6035e8eb6d68d2d3f7602d5cba506f5fe3f37a67 not found: ID does not exist"
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.321308 4959 scope.go:117] "RemoveContainer" containerID="d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354"
Oct 07 13:19:44 crc kubenswrapper[4959]: E1007 13:19:44.321530 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354\": container with ID starting with d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354 not found: ID does not exist" containerID="d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354"
Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.321561 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354"} err="failed to get container status \"d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354\": rpc error: code = NotFound desc = could not find container \"d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354\": container with ID starting with d00f48d09ee2fafea68a420687043f9e5fc7494c2d9d90a3e1fa696e6d4d3354 not found: ID does not 
exist" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.325822 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.335544 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.356353 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:44 crc kubenswrapper[4959]: E1007 13:19:44.356855 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-log" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.356873 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-log" Oct 07 13:19:44 crc kubenswrapper[4959]: E1007 13:19:44.356895 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-api" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.356902 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-api" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.357089 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-api" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.357101 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" containerName="nova-api-log" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.358107 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.361165 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.361164 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.361319 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.364596 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.399829 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9df54ad-6d66-4f72-a490-c544998b2a5c-logs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.399887 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.399916 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-config-data\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.399951 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x422l\" (UniqueName: 
\"kubernetes.io/projected/f9df54ad-6d66-4f72-a490-c544998b2a5c-kube-api-access-x422l\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.400007 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.400070 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.502659 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.502747 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9df54ad-6d66-4f72-a490-c544998b2a5c-logs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.502776 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 
crc kubenswrapper[4959]: I1007 13:19:44.502803 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-config-data\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.502837 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x422l\" (UniqueName: \"kubernetes.io/projected/f9df54ad-6d66-4f72-a490-c544998b2a5c-kube-api-access-x422l\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.502889 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.503410 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9df54ad-6d66-4f72-a490-c544998b2a5c-logs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.507430 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.508239 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.508856 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-config-data\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.521435 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.521950 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x422l\" (UniqueName: \"kubernetes.io/projected/f9df54ad-6d66-4f72-a490-c544998b2a5c-kube-api-access-x422l\") pod \"nova-api-0\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") " pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.686434 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.824842 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f2af7b-e102-44d0-904d-b87dade4aad0" path="/var/lib/kubelet/pods/37f2af7b-e102-44d0-904d-b87dade4aad0/volumes" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.826575 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763a8b33-d767-44c1-9f77-9f0c97b88490" path="/var/lib/kubelet/pods/763a8b33-d767-44c1-9f77-9f0c97b88490/volumes" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.856561 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:44 crc kubenswrapper[4959]: I1007 13:19:44.886414 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.197474 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:19:45 crc kubenswrapper[4959]: W1007 13:19:45.201033 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9df54ad_6d66_4f72_a490_c544998b2a5c.slice/crio-9fe7358fda6a378d20f4d28e6a4ca3e273a1a95fea9d91f93979a24a3bb61404 WatchSource:0}: Error finding container 9fe7358fda6a378d20f4d28e6a4ca3e273a1a95fea9d91f93979a24a3bb61404: Status 404 returned error can't find the container with id 9fe7358fda6a378d20f4d28e6a4ca3e273a1a95fea9d91f93979a24a3bb61404 Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.281095 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerStarted","Data":"051b7ddb5928aaeabb43744b419252997d6acc6334fd0f07a1cf59d37a66edc8"} Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.283651 4959 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"f9df54ad-6d66-4f72-a490-c544998b2a5c","Type":"ContainerStarted","Data":"9fe7358fda6a378d20f4d28e6a4ca3e273a1a95fea9d91f93979a24a3bb61404"} Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.298363 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.572250 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-clpnc"] Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.573972 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.577273 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.577289 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.579638 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-clpnc"] Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.668863 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6rll\" (UniqueName: \"kubernetes.io/projected/4b13bdd1-f729-4a5a-bef3-4587cdac360f-kube-api-access-f6rll\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.668936 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-scripts\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " 
pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.668979 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.668997 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-config-data\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.770394 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6rll\" (UniqueName: \"kubernetes.io/projected/4b13bdd1-f729-4a5a-bef3-4587cdac360f-kube-api-access-f6rll\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.770921 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-scripts\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.770980 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " 
pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.771005 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-config-data\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.775942 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-config-data\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.777688 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-scripts\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.783155 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 13:19:45.792435 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6rll\" (UniqueName: \"kubernetes.io/projected/4b13bdd1-f729-4a5a-bef3-4587cdac360f-kube-api-access-f6rll\") pod \"nova-cell1-cell-mapping-clpnc\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") " pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:45 crc kubenswrapper[4959]: I1007 
13:19:45.922402 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-clpnc" Oct 07 13:19:46 crc kubenswrapper[4959]: I1007 13:19:46.291528 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9df54ad-6d66-4f72-a490-c544998b2a5c","Type":"ContainerStarted","Data":"54adc4721be4ea98501204f14cc21a9c60caeafb9d524352b9470c10d9d1efef"} Oct 07 13:19:46 crc kubenswrapper[4959]: I1007 13:19:46.291990 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9df54ad-6d66-4f72-a490-c544998b2a5c","Type":"ContainerStarted","Data":"389f7019b12cf8f16c0d27ea65d893bae3f89e56c148ca3fe93572ce8ec2eb42"} Oct 07 13:19:46 crc kubenswrapper[4959]: I1007 13:19:46.294805 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerStarted","Data":"04a381b3bf039da37374850b0bddb599b4be36decc14cca6e1825ff7a2aececb"} Oct 07 13:19:46 crc kubenswrapper[4959]: I1007 13:19:46.315161 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.315145034 podStartE2EDuration="2.315145034s" podCreationTimestamp="2025-10-07 13:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:46.307073498 +0000 UTC m=+1138.467796185" watchObservedRunningTime="2025-10-07 13:19:46.315145034 +0000 UTC m=+1138.475867711" Oct 07 13:19:46 crc kubenswrapper[4959]: I1007 13:19:46.348006 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-clpnc"] Oct 07 13:19:46 crc kubenswrapper[4959]: W1007 13:19:46.349416 4959 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b13bdd1_f729_4a5a_bef3_4587cdac360f.slice/crio-7a1d822d77dd58b11c8f3509fe5e4f207ad8af13ccdf9e6c7bc2262e6d71b764 WatchSource:0}: Error finding container 7a1d822d77dd58b11c8f3509fe5e4f207ad8af13ccdf9e6c7bc2262e6d71b764: Status 404 returned error can't find the container with id 7a1d822d77dd58b11c8f3509fe5e4f207ad8af13ccdf9e6c7bc2262e6d71b764 Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.344738 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerStarted","Data":"289627a5d6d5996a028bb6befef0e7a2931e7337f456bf41ae79cde2cc8c3c13"} Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.376655 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-clpnc" event={"ID":"4b13bdd1-f729-4a5a-bef3-4587cdac360f","Type":"ContainerStarted","Data":"6566960a178f02f7dd39576384b50412d61e11bffcdd9cf1d8bf580a02c87d59"} Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.376696 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-clpnc" event={"ID":"4b13bdd1-f729-4a5a-bef3-4587cdac360f","Type":"ContainerStarted","Data":"7a1d822d77dd58b11c8f3509fe5e4f207ad8af13ccdf9e6c7bc2262e6d71b764"} Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.415307 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-clpnc" podStartSLOduration=2.415289815 podStartE2EDuration="2.415289815s" podCreationTimestamp="2025-10-07 13:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:47.411988869 +0000 UTC m=+1139.572711546" watchObservedRunningTime="2025-10-07 13:19:47.415289815 +0000 UTC m=+1139.576012492" Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.710852 4959 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.782370 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54974c8ff5-2tnwp"] Oct 07 13:19:47 crc kubenswrapper[4959]: I1007 13:19:47.782655 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" podUID="754c17de-0a0b-4307-83b8-52cec2996433" containerName="dnsmasq-dns" containerID="cri-o://d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947" gracePeriod=10 Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.273255 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.323242 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-config\") pod \"754c17de-0a0b-4307-83b8-52cec2996433\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.323437 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-sb\") pod \"754c17de-0a0b-4307-83b8-52cec2996433\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.323480 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-nb\") pod \"754c17de-0a0b-4307-83b8-52cec2996433\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.323663 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-84t5b\" (UniqueName: \"kubernetes.io/projected/754c17de-0a0b-4307-83b8-52cec2996433-kube-api-access-84t5b\") pod \"754c17de-0a0b-4307-83b8-52cec2996433\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.323697 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-dns-svc\") pod \"754c17de-0a0b-4307-83b8-52cec2996433\" (UID: \"754c17de-0a0b-4307-83b8-52cec2996433\") " Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.348812 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754c17de-0a0b-4307-83b8-52cec2996433-kube-api-access-84t5b" (OuterVolumeSpecName: "kube-api-access-84t5b") pod "754c17de-0a0b-4307-83b8-52cec2996433" (UID: "754c17de-0a0b-4307-83b8-52cec2996433"). InnerVolumeSpecName "kube-api-access-84t5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.371528 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "754c17de-0a0b-4307-83b8-52cec2996433" (UID: "754c17de-0a0b-4307-83b8-52cec2996433"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.378007 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-config" (OuterVolumeSpecName: "config") pod "754c17de-0a0b-4307-83b8-52cec2996433" (UID: "754c17de-0a0b-4307-83b8-52cec2996433"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.385892 4959 generic.go:334] "Generic (PLEG): container finished" podID="754c17de-0a0b-4307-83b8-52cec2996433" containerID="d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947" exitCode=0
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.385968 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" event={"ID":"754c17de-0a0b-4307-83b8-52cec2996433","Type":"ContainerDied","Data":"d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947"}
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.386026 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp" event={"ID":"754c17de-0a0b-4307-83b8-52cec2996433","Type":"ContainerDied","Data":"e6ae1108a6d4f316776415a430bd16a88fe8a23d1d043ad47932be47c04a2fd3"}
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.386043 4959 scope.go:117] "RemoveContainer" containerID="d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.385987 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54974c8ff5-2tnwp"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.386115 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "754c17de-0a0b-4307-83b8-52cec2996433" (UID: "754c17de-0a0b-4307-83b8-52cec2996433"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.390372 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "754c17de-0a0b-4307-83b8-52cec2996433" (UID: "754c17de-0a0b-4307-83b8-52cec2996433"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.392701 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerStarted","Data":"c65d12018f1beac17f92afa2e2e6418f6f617507dfac1720c2f32dfa96d80c95"}
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.392792 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.426022 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-config\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.426050 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.426060 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.426069 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84t5b\" (UniqueName: \"kubernetes.io/projected/754c17de-0a0b-4307-83b8-52cec2996433-kube-api-access-84t5b\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.426077 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754c17de-0a0b-4307-83b8-52cec2996433-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.433833 4959 scope.go:117] "RemoveContainer" containerID="ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.451181 4959 scope.go:117] "RemoveContainer" containerID="d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947"
Oct 07 13:19:48 crc kubenswrapper[4959]: E1007 13:19:48.452096 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947\": container with ID starting with d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947 not found: ID does not exist" containerID="d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.452145 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947"} err="failed to get container status \"d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947\": rpc error: code = NotFound desc = could not find container \"d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947\": container with ID starting with d55eef0f0de84169082d1dc5285e518e2252d1d6db91524667272a9350938947 not found: ID does not exist"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.452174 4959 scope.go:117] "RemoveContainer" containerID="ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c"
Oct 07 13:19:48 crc kubenswrapper[4959]: E1007 13:19:48.452479 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c\": container with ID starting with ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c not found: ID does not exist" containerID="ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.452510 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c"} err="failed to get container status \"ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c\": rpc error: code = NotFound desc = could not find container \"ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c\": container with ID starting with ed68be46a4994de60a5904814885a583779d8034c93f7d379143229952c9307c not found: ID does not exist"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.717233 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.922308667 podStartE2EDuration="5.717211882s" podCreationTimestamp="2025-10-07 13:19:43 +0000 UTC" firstStartedPulling="2025-10-07 13:19:44.206763339 +0000 UTC m=+1136.367486016" lastFinishedPulling="2025-10-07 13:19:48.001666554 +0000 UTC m=+1140.162389231" observedRunningTime="2025-10-07 13:19:48.419710637 +0000 UTC m=+1140.580433324" watchObservedRunningTime="2025-10-07 13:19:48.717211882 +0000 UTC m=+1140.877934569"
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.717594 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54974c8ff5-2tnwp"]
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.729433 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54974c8ff5-2tnwp"]
Oct 07 13:19:48 crc kubenswrapper[4959]: I1007 13:19:48.860043 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754c17de-0a0b-4307-83b8-52cec2996433" path="/var/lib/kubelet/pods/754c17de-0a0b-4307-83b8-52cec2996433/volumes"
Oct 07 13:19:52 crc kubenswrapper[4959]: I1007 13:19:52.441886 4959 generic.go:334] "Generic (PLEG): container finished" podID="4b13bdd1-f729-4a5a-bef3-4587cdac360f" containerID="6566960a178f02f7dd39576384b50412d61e11bffcdd9cf1d8bf580a02c87d59" exitCode=0
Oct 07 13:19:52 crc kubenswrapper[4959]: I1007 13:19:52.441960 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-clpnc" event={"ID":"4b13bdd1-f729-4a5a-bef3-4587cdac360f","Type":"ContainerDied","Data":"6566960a178f02f7dd39576384b50412d61e11bffcdd9cf1d8bf580a02c87d59"}
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.765521 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-clpnc"
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.837426 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-scripts\") pod \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") "
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.837682 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-combined-ca-bundle\") pod \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") "
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.837829 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-config-data\") pod \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") "
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.837869 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6rll\" (UniqueName: \"kubernetes.io/projected/4b13bdd1-f729-4a5a-bef3-4587cdac360f-kube-api-access-f6rll\") pod \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\" (UID: \"4b13bdd1-f729-4a5a-bef3-4587cdac360f\") "
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.843244 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-scripts" (OuterVolumeSpecName: "scripts") pod "4b13bdd1-f729-4a5a-bef3-4587cdac360f" (UID: "4b13bdd1-f729-4a5a-bef3-4587cdac360f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.849402 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b13bdd1-f729-4a5a-bef3-4587cdac360f-kube-api-access-f6rll" (OuterVolumeSpecName: "kube-api-access-f6rll") pod "4b13bdd1-f729-4a5a-bef3-4587cdac360f" (UID: "4b13bdd1-f729-4a5a-bef3-4587cdac360f"). InnerVolumeSpecName "kube-api-access-f6rll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.864533 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-config-data" (OuterVolumeSpecName: "config-data") pod "4b13bdd1-f729-4a5a-bef3-4587cdac360f" (UID: "4b13bdd1-f729-4a5a-bef3-4587cdac360f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.868748 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b13bdd1-f729-4a5a-bef3-4587cdac360f" (UID: "4b13bdd1-f729-4a5a-bef3-4587cdac360f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.940346 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.940381 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.940390 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6rll\" (UniqueName: \"kubernetes.io/projected/4b13bdd1-f729-4a5a-bef3-4587cdac360f-kube-api-access-f6rll\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:53 crc kubenswrapper[4959]: I1007 13:19:53.940400 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b13bdd1-f729-4a5a-bef3-4587cdac360f-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.460158 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-clpnc" event={"ID":"4b13bdd1-f729-4a5a-bef3-4587cdac360f","Type":"ContainerDied","Data":"7a1d822d77dd58b11c8f3509fe5e4f207ad8af13ccdf9e6c7bc2262e6d71b764"}
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.460195 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a1d822d77dd58b11c8f3509fe5e4f207ad8af13ccdf9e6c7bc2262e6d71b764"
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.460209 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-clpnc"
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.662139 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.662445 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e3271361-342b-4173-a261-884f63259112" containerName="nova-scheduler-scheduler" containerID="cri-o://61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" gracePeriod=30
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.672370 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.672710 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-log" containerID="cri-o://389f7019b12cf8f16c0d27ea65d893bae3f89e56c148ca3fe93572ce8ec2eb42" gracePeriod=30
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.672766 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-api" containerID="cri-o://54adc4721be4ea98501204f14cc21a9c60caeafb9d524352b9470c10d9d1efef" gracePeriod=30
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.679454 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.679695 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-log" containerID="cri-o://451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63" gracePeriod=30
Oct 07 13:19:54 crc kubenswrapper[4959]: I1007 13:19:54.679817 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-metadata" containerID="cri-o://e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b" gracePeriod=30
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.486333 4959 generic.go:334] "Generic (PLEG): container finished" podID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerID="451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63" exitCode=143
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.486431 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b","Type":"ContainerDied","Data":"451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63"}
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.493545 4959 generic.go:334] "Generic (PLEG): container finished" podID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerID="54adc4721be4ea98501204f14cc21a9c60caeafb9d524352b9470c10d9d1efef" exitCode=0
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.493575 4959 generic.go:334] "Generic (PLEG): container finished" podID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerID="389f7019b12cf8f16c0d27ea65d893bae3f89e56c148ca3fe93572ce8ec2eb42" exitCode=143
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.493595 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9df54ad-6d66-4f72-a490-c544998b2a5c","Type":"ContainerDied","Data":"54adc4721be4ea98501204f14cc21a9c60caeafb9d524352b9470c10d9d1efef"}
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.493619 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9df54ad-6d66-4f72-a490-c544998b2a5c","Type":"ContainerDied","Data":"389f7019b12cf8f16c0d27ea65d893bae3f89e56c148ca3fe93572ce8ec2eb42"}
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.902162 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.973651 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-internal-tls-certs\") pod \"f9df54ad-6d66-4f72-a490-c544998b2a5c\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") "
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.973939 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-config-data\") pod \"f9df54ad-6d66-4f72-a490-c544998b2a5c\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") "
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.974387 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-public-tls-certs\") pod \"f9df54ad-6d66-4f72-a490-c544998b2a5c\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") "
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.974426 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-combined-ca-bundle\") pod \"f9df54ad-6d66-4f72-a490-c544998b2a5c\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") "
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.974468 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x422l\" (UniqueName: \"kubernetes.io/projected/f9df54ad-6d66-4f72-a490-c544998b2a5c-kube-api-access-x422l\") pod \"f9df54ad-6d66-4f72-a490-c544998b2a5c\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") "
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.974495 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9df54ad-6d66-4f72-a490-c544998b2a5c-logs\") pod \"f9df54ad-6d66-4f72-a490-c544998b2a5c\" (UID: \"f9df54ad-6d66-4f72-a490-c544998b2a5c\") "
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.975079 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9df54ad-6d66-4f72-a490-c544998b2a5c-logs" (OuterVolumeSpecName: "logs") pod "f9df54ad-6d66-4f72-a490-c544998b2a5c" (UID: "f9df54ad-6d66-4f72-a490-c544998b2a5c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:19:55 crc kubenswrapper[4959]: I1007 13:19:55.980028 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9df54ad-6d66-4f72-a490-c544998b2a5c-kube-api-access-x422l" (OuterVolumeSpecName: "kube-api-access-x422l") pod "f9df54ad-6d66-4f72-a490-c544998b2a5c" (UID: "f9df54ad-6d66-4f72-a490-c544998b2a5c"). InnerVolumeSpecName "kube-api-access-x422l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.002237 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9df54ad-6d66-4f72-a490-c544998b2a5c" (UID: "f9df54ad-6d66-4f72-a490-c544998b2a5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.003720 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-config-data" (OuterVolumeSpecName: "config-data") pod "f9df54ad-6d66-4f72-a490-c544998b2a5c" (UID: "f9df54ad-6d66-4f72-a490-c544998b2a5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.021670 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f9df54ad-6d66-4f72-a490-c544998b2a5c" (UID: "f9df54ad-6d66-4f72-a490-c544998b2a5c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.030698 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f9df54ad-6d66-4f72-a490-c544998b2a5c" (UID: "f9df54ad-6d66-4f72-a490-c544998b2a5c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.076134 4959 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.076167 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.076176 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x422l\" (UniqueName: \"kubernetes.io/projected/f9df54ad-6d66-4f72-a490-c544998b2a5c-kube-api-access-x422l\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.076206 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9df54ad-6d66-4f72-a490-c544998b2a5c-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.076216 4959 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.076224 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9df54ad-6d66-4f72-a490-c544998b2a5c-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.376224 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.377826 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.378830 4959 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.378859 4959 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e3271361-342b-4173-a261-884f63259112" containerName="nova-scheduler-scheduler"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.502789 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9df54ad-6d66-4f72-a490-c544998b2a5c","Type":"ContainerDied","Data":"9fe7358fda6a378d20f4d28e6a4ca3e273a1a95fea9d91f93979a24a3bb61404"}
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.502841 4959 scope.go:117] "RemoveContainer" containerID="54adc4721be4ea98501204f14cc21a9c60caeafb9d524352b9470c10d9d1efef"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.502842 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.524946 4959 scope.go:117] "RemoveContainer" containerID="389f7019b12cf8f16c0d27ea65d893bae3f89e56c148ca3fe93572ce8ec2eb42"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.531685 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.540144 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.552546 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.556945 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b13bdd1-f729-4a5a-bef3-4587cdac360f" containerName="nova-manage"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.556972 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b13bdd1-f729-4a5a-bef3-4587cdac360f" containerName="nova-manage"
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.556995 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754c17de-0a0b-4307-83b8-52cec2996433" containerName="dnsmasq-dns"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557001 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="754c17de-0a0b-4307-83b8-52cec2996433" containerName="dnsmasq-dns"
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.557021 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-api"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557028 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-api"
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.557040 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754c17de-0a0b-4307-83b8-52cec2996433" containerName="init"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557046 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="754c17de-0a0b-4307-83b8-52cec2996433" containerName="init"
Oct 07 13:19:56 crc kubenswrapper[4959]: E1007 13:19:56.557060 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-log"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557089 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-log"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557323 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b13bdd1-f729-4a5a-bef3-4587cdac360f" containerName="nova-manage"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557342 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-api"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557356 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" containerName="nova-api-log"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.557370 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="754c17de-0a0b-4307-83b8-52cec2996433" containerName="dnsmasq-dns"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.558249 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.560580 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.560850 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.561006 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.568479 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.687603 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.687685 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-logs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.687720 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.687767 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-public-tls-certs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.687890 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbk4\" (UniqueName: \"kubernetes.io/projected/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-kube-api-access-pwbk4\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.688100 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-config-data\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791152 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791241 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-logs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791293 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791326 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-public-tls-certs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791372 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbk4\" (UniqueName: \"kubernetes.io/projected/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-kube-api-access-pwbk4\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791472 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-config-data\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.791900 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-logs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.795131 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-public-tls-certs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.795211 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.795349 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.803328 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-config-data\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.818382 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbk4\" (UniqueName: \"kubernetes.io/projected/bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e-kube-api-access-pwbk4\") pod \"nova-api-0\" (UID: \"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e\") " pod="openstack/nova-api-0"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.821878 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9df54ad-6d66-4f72-a490-c544998b2a5c" path="/var/lib/kubelet/pods/f9df54ad-6d66-4f72-a490-c544998b2a5c/volumes"
Oct 07 13:19:56 crc kubenswrapper[4959]: I1007 13:19:56.874662 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 13:19:57 crc kubenswrapper[4959]: I1007 13:19:57.299282 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 07 13:19:57 crc kubenswrapper[4959]: W1007 13:19:57.309787 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc4aa01_2f10_4f3d_b7d0_fb60132a1c2e.slice/crio-b2f4d258849a9609f593245a2de76929226bcbb0d3f91aa8de08a58463282152 WatchSource:0}: Error finding container b2f4d258849a9609f593245a2de76929226bcbb0d3f91aa8de08a58463282152: Status 404 returned error can't find the container with id b2f4d258849a9609f593245a2de76929226bcbb0d3f91aa8de08a58463282152
Oct 07 13:19:57 crc kubenswrapper[4959]: I1007 13:19:57.512133 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e","Type":"ContainerStarted","Data":"b2f4d258849a9609f593245a2de76929226bcbb0d3f91aa8de08a58463282152"}
Oct 07 13:19:57 crc kubenswrapper[4959]: I1007 13:19:57.975255 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:47720->10.217.0.184:8775: read: connection reset by peer"
Oct 07 13:19:57 crc kubenswrapper[4959]: I1007 13:19:57.975263 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:47732->10.217.0.184:8775: read: connection reset by peer"
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.366105 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.521739 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-nova-metadata-tls-certs\") pod \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") "
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.521828 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-logs\") pod \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") "
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.521909 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xth7\" (UniqueName: \"kubernetes.io/projected/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-kube-api-access-4xth7\") pod \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") "
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.522004 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-config-data\") pod \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") "
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.522131 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-combined-ca-bundle\") pod \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\" (UID: \"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b\") "
Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.522735 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-logs" (OuterVolumeSpecName: "logs") pod "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" (UID: "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.528935 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-kube-api-access-4xth7" (OuterVolumeSpecName: "kube-api-access-4xth7") pod "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" (UID: "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b"). InnerVolumeSpecName "kube-api-access-4xth7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.540099 4959 generic.go:334] "Generic (PLEG): container finished" podID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerID="e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b" exitCode=0 Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.540151 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.540181 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b","Type":"ContainerDied","Data":"e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b"} Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.540213 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ed103adc-f3b4-4fc5-8f29-ca6a511eab7b","Type":"ContainerDied","Data":"3615f8a9841d9c04333f7c02a933c97033d23b51c1961487bd386c070b259768"} Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.540232 4959 scope.go:117] "RemoveContainer" containerID="e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.548509 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e","Type":"ContainerStarted","Data":"e129e4ff3f31387941222cac554948a196c8e8e692d4a434f39e1f8864749aeb"} Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.548554 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e","Type":"ContainerStarted","Data":"8fb89ac9d9da6554564918197ec1292944b3fbe772fe61d1dfb0024dea69154a"} Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.555004 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-config-data" (OuterVolumeSpecName: "config-data") pod "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" (UID: "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.571278 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" (UID: "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.586609 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.586584616 podStartE2EDuration="2.586584616s" podCreationTimestamp="2025-10-07 13:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:19:58.576697907 +0000 UTC m=+1150.737420614" watchObservedRunningTime="2025-10-07 13:19:58.586584616 +0000 UTC m=+1150.747307313" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.603878 4959 scope.go:117] "RemoveContainer" containerID="451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.605450 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" (UID: "ed103adc-f3b4-4fc5-8f29-ca6a511eab7b"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.625013 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.625065 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xth7\" (UniqueName: \"kubernetes.io/projected/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-kube-api-access-4xth7\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.625077 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.625086 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.625095 4959 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.642804 4959 scope.go:117] "RemoveContainer" containerID="e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b" Oct 07 13:19:58 crc kubenswrapper[4959]: E1007 13:19:58.644071 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b\": container with ID starting with e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b not found: ID does not exist" 
containerID="e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.644134 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b"} err="failed to get container status \"e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b\": rpc error: code = NotFound desc = could not find container \"e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b\": container with ID starting with e357205851691ace5191cfc474ae0d2474c5e96123c59147c437a77dc4d8b81b not found: ID does not exist" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.644163 4959 scope.go:117] "RemoveContainer" containerID="451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63" Oct 07 13:19:58 crc kubenswrapper[4959]: E1007 13:19:58.647872 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63\": container with ID starting with 451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63 not found: ID does not exist" containerID="451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.647926 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63"} err="failed to get container status \"451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63\": rpc error: code = NotFound desc = could not find container \"451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63\": container with ID starting with 451cb4c4125c6b464aa8c51bdb9f5e02cdf590fc05c82406b195d7c739c19e63 not found: ID does not exist" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.916364 4959 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.925511 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.934821 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:58 crc kubenswrapper[4959]: E1007 13:19:58.935274 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-metadata" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.935301 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-metadata" Oct 07 13:19:58 crc kubenswrapper[4959]: E1007 13:19:58.935345 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-log" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.935354 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-log" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.935535 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-metadata" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.935561 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" containerName="nova-metadata-log" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.936597 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.939024 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.939291 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:19:58 crc kubenswrapper[4959]: I1007 13:19:58.943232 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.035258 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f25xw\" (UniqueName: \"kubernetes.io/projected/627022b9-2219-4bcd-a001-53bf9e863c14-kube-api-access-f25xw\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.035420 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.035463 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627022b9-2219-4bcd-a001-53bf9e863c14-logs\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.035501 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.035542 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-config-data\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.137760 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f25xw\" (UniqueName: \"kubernetes.io/projected/627022b9-2219-4bcd-a001-53bf9e863c14-kube-api-access-f25xw\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.137827 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.137849 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627022b9-2219-4bcd-a001-53bf9e863c14-logs\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.137879 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 
13:19:59.137909 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-config-data\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.139296 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627022b9-2219-4bcd-a001-53bf9e863c14-logs\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.141867 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.142115 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.142355 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627022b9-2219-4bcd-a001-53bf9e863c14-config-data\") pod \"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.159020 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f25xw\" (UniqueName: \"kubernetes.io/projected/627022b9-2219-4bcd-a001-53bf9e863c14-kube-api-access-f25xw\") pod 
\"nova-metadata-0\" (UID: \"627022b9-2219-4bcd-a001-53bf9e863c14\") " pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.251949 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:19:59 crc kubenswrapper[4959]: I1007 13:19:59.683981 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.277895 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.364731 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-combined-ca-bundle\") pod \"e3271361-342b-4173-a261-884f63259112\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.364829 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-config-data\") pod \"e3271361-342b-4173-a261-884f63259112\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.365023 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zwz\" (UniqueName: \"kubernetes.io/projected/e3271361-342b-4173-a261-884f63259112-kube-api-access-s5zwz\") pod \"e3271361-342b-4173-a261-884f63259112\" (UID: \"e3271361-342b-4173-a261-884f63259112\") " Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.370808 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3271361-342b-4173-a261-884f63259112-kube-api-access-s5zwz" (OuterVolumeSpecName: "kube-api-access-s5zwz") pod "e3271361-342b-4173-a261-884f63259112" (UID: 
"e3271361-342b-4173-a261-884f63259112"). InnerVolumeSpecName "kube-api-access-s5zwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.389305 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3271361-342b-4173-a261-884f63259112" (UID: "e3271361-342b-4173-a261-884f63259112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.400790 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-config-data" (OuterVolumeSpecName: "config-data") pod "e3271361-342b-4173-a261-884f63259112" (UID: "e3271361-342b-4173-a261-884f63259112"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.467325 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.467360 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3271361-342b-4173-a261-884f63259112-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.467374 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5zwz\" (UniqueName: \"kubernetes.io/projected/e3271361-342b-4173-a261-884f63259112-kube-api-access-s5zwz\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.569141 4959 generic.go:334] "Generic (PLEG): container finished" podID="e3271361-342b-4173-a261-884f63259112" 
containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" exitCode=0 Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.569192 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3271361-342b-4173-a261-884f63259112","Type":"ContainerDied","Data":"61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25"} Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.569208 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.569246 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3271361-342b-4173-a261-884f63259112","Type":"ContainerDied","Data":"311f8dd52c688a1a6577478d6529e5bd72588dc330750b07dbbd4ce8cdcbf617"} Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.569272 4959 scope.go:117] "RemoveContainer" containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.570773 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"627022b9-2219-4bcd-a001-53bf9e863c14","Type":"ContainerStarted","Data":"58325e77777467558c6d3cac49d49289871a7f6e8e4817018a202f8f5d2a2b52"} Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.570795 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"627022b9-2219-4bcd-a001-53bf9e863c14","Type":"ContainerStarted","Data":"8f4015271bbfe07e7ec7dc4ea9eb0a05912390638536c9de859802332f0f6bc8"} Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.570806 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"627022b9-2219-4bcd-a001-53bf9e863c14","Type":"ContainerStarted","Data":"2a92772dff525af7c49542ec69011952a76693441026054e8f54064b5dcd002f"} Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 
13:20:00.585527 4959 scope.go:117] "RemoveContainer" containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" Oct 07 13:20:00 crc kubenswrapper[4959]: E1007 13:20:00.585878 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25\": container with ID starting with 61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25 not found: ID does not exist" containerID="61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.585955 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25"} err="failed to get container status \"61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25\": rpc error: code = NotFound desc = could not find container \"61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25\": container with ID starting with 61a97229a7e9956430081a2a08aa64ab409e96b682b54607118b9698c2c2dd25 not found: ID does not exist" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.610363 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.610346435 podStartE2EDuration="2.610346435s" podCreationTimestamp="2025-10-07 13:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:20:00.592005878 +0000 UTC m=+1152.752728575" watchObservedRunningTime="2025-10-07 13:20:00.610346435 +0000 UTC m=+1152.771069102" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.622126 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.635750 4959 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.653132 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:20:00 crc kubenswrapper[4959]: E1007 13:20:00.656080 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3271361-342b-4173-a261-884f63259112" containerName="nova-scheduler-scheduler" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.656104 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3271361-342b-4173-a261-884f63259112" containerName="nova-scheduler-scheduler" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.656318 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3271361-342b-4173-a261-884f63259112" containerName="nova-scheduler-scheduler" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.656958 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.664725 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.682220 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.774018 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdq4\" (UniqueName: \"kubernetes.io/projected/a46a9782-e96e-432c-b2e8-c7863291485e-kube-api-access-lsdq4\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.774246 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a46a9782-e96e-432c-b2e8-c7863291485e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.774306 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a46a9782-e96e-432c-b2e8-c7863291485e-config-data\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.818536 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3271361-342b-4173-a261-884f63259112" path="/var/lib/kubelet/pods/e3271361-342b-4173-a261-884f63259112/volumes" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.819271 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed103adc-f3b4-4fc5-8f29-ca6a511eab7b" path="/var/lib/kubelet/pods/ed103adc-f3b4-4fc5-8f29-ca6a511eab7b/volumes" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.876432 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdq4\" (UniqueName: \"kubernetes.io/projected/a46a9782-e96e-432c-b2e8-c7863291485e-kube-api-access-lsdq4\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.876490 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a46a9782-e96e-432c-b2e8-c7863291485e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.876522 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a46a9782-e96e-432c-b2e8-c7863291485e-config-data\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.882084 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a46a9782-e96e-432c-b2e8-c7863291485e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.882178 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a46a9782-e96e-432c-b2e8-c7863291485e-config-data\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.902263 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdq4\" (UniqueName: \"kubernetes.io/projected/a46a9782-e96e-432c-b2e8-c7863291485e-kube-api-access-lsdq4\") pod \"nova-scheduler-0\" (UID: \"a46a9782-e96e-432c-b2e8-c7863291485e\") " pod="openstack/nova-scheduler-0" Oct 07 13:20:00 crc kubenswrapper[4959]: I1007 13:20:00.981509 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:20:01 crc kubenswrapper[4959]: I1007 13:20:01.402234 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:20:01 crc kubenswrapper[4959]: I1007 13:20:01.582578 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a46a9782-e96e-432c-b2e8-c7863291485e","Type":"ContainerStarted","Data":"e2a5405546b76d8790a28ab3d333423491c524fb6a510b1d902fc3302350907c"} Oct 07 13:20:02 crc kubenswrapper[4959]: I1007 13:20:02.591098 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a46a9782-e96e-432c-b2e8-c7863291485e","Type":"ContainerStarted","Data":"02947e0ce451c7a74729e8b377d497a7afe5d11c77602421a132b93540657bcc"} Oct 07 13:20:02 crc kubenswrapper[4959]: I1007 13:20:02.613588 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.613563032 podStartE2EDuration="2.613563032s" podCreationTimestamp="2025-10-07 13:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:20:02.611947435 +0000 UTC m=+1154.772670172" watchObservedRunningTime="2025-10-07 13:20:02.613563032 +0000 UTC m=+1154.774285749" Oct 07 13:20:04 crc kubenswrapper[4959]: I1007 13:20:04.253025 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:20:04 crc kubenswrapper[4959]: I1007 13:20:04.253062 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:20:05 crc kubenswrapper[4959]: I1007 13:20:05.981994 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 13:20:06 crc kubenswrapper[4959]: I1007 13:20:06.875506 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:20:06 crc kubenswrapper[4959]: I1007 13:20:06.875810 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:20:07 crc kubenswrapper[4959]: I1007 13:20:07.696256 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:20:07 crc kubenswrapper[4959]: I1007 13:20:07.696330 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:20:07 crc kubenswrapper[4959]: I1007 13:20:07.889804 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.193:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 13:20:07 crc kubenswrapper[4959]: I1007 13:20:07.889883 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.193:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 13:20:09 crc kubenswrapper[4959]: I1007 13:20:09.252223 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:20:09 crc kubenswrapper[4959]: I1007 13:20:09.252649 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Oct 07 13:20:10 crc kubenswrapper[4959]: I1007 13:20:10.265808 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="627022b9-2219-4bcd-a001-53bf9e863c14" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 13:20:10 crc kubenswrapper[4959]: I1007 13:20:10.265849 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="627022b9-2219-4bcd-a001-53bf9e863c14" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 13:20:10 crc kubenswrapper[4959]: I1007 13:20:10.981798 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 13:20:11 crc kubenswrapper[4959]: I1007 13:20:11.008986 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 13:20:11 crc kubenswrapper[4959]: I1007 13:20:11.686812 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 13:20:13 crc kubenswrapper[4959]: I1007 13:20:13.660776 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 13:20:16 crc kubenswrapper[4959]: I1007 13:20:16.882290 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 13:20:16 crc kubenswrapper[4959]: I1007 13:20:16.883264 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 13:20:16 crc kubenswrapper[4959]: I1007 13:20:16.883321 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 13:20:16 crc kubenswrapper[4959]: I1007 
13:20:16.890881 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 13:20:17 crc kubenswrapper[4959]: I1007 13:20:17.709838 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 13:20:17 crc kubenswrapper[4959]: I1007 13:20:17.718820 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 13:20:19 crc kubenswrapper[4959]: I1007 13:20:19.257152 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 13:20:19 crc kubenswrapper[4959]: I1007 13:20:19.258865 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 13:20:19 crc kubenswrapper[4959]: I1007 13:20:19.262573 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 13:20:19 crc kubenswrapper[4959]: I1007 13:20:19.732299 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 13:20:27 crc kubenswrapper[4959]: I1007 13:20:27.571877 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:20:28 crc kubenswrapper[4959]: I1007 13:20:28.387217 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:20:31 crc kubenswrapper[4959]: I1007 13:20:31.762922 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="rabbitmq" containerID="cri-o://94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648" gracePeriod=604796 Oct 07 13:20:32 crc kubenswrapper[4959]: I1007 13:20:32.071287 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" 
podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="rabbitmq" containerID="cri-o://5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554" gracePeriod=604797 Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.482905 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.695509 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.695558 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.695598 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.696119 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6df8724a1b950c3c36c1fa6f27f6e7a3f2c184c3d4c9478bac7b1998fa538dc"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.696173 4959 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://c6df8724a1b950c3c36c1fa6f27f6e7a3f2c184c3d4c9478bac7b1998fa538dc" gracePeriod=600 Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.808410 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.896882 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="c6df8724a1b950c3c36c1fa6f27f6e7a3f2c184c3d4c9478bac7b1998fa538dc" exitCode=0 Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.896924 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"c6df8724a1b950c3c36c1fa6f27f6e7a3f2c184c3d4c9478bac7b1998fa538dc"} Oct 07 13:20:37 crc kubenswrapper[4959]: I1007 13:20:37.896958 4959 scope.go:117] "RemoveContainer" containerID="bf0d8a96d5046ea44da887dd65609025728fc1479fe4b34e19d62ea3b31f2ff1" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.340146 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.468752 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-plugins\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.468794 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-config-data\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.468858 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-erlang-cookie\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.468919 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8703d817-5027-4394-a52d-a895f7e0fd10-pod-info\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.468939 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzjr8\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-kube-api-access-vzjr8\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.469006 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-server-conf\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.469023 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-confd\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.469090 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.469110 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-plugins-conf\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.469151 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-tls\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.469176 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8703d817-5027-4394-a52d-a895f7e0fd10-erlang-cookie-secret\") pod \"8703d817-5027-4394-a52d-a895f7e0fd10\" (UID: \"8703d817-5027-4394-a52d-a895f7e0fd10\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 
13:20:38.470196 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.471215 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.471291 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.476484 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8703d817-5027-4394-a52d-a895f7e0fd10-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.476915 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.478784 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-kube-api-access-vzjr8" (OuterVolumeSpecName: "kube-api-access-vzjr8") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "kube-api-access-vzjr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.479738 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.479892 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8703d817-5027-4394-a52d-a895f7e0fd10-pod-info" (OuterVolumeSpecName: "pod-info") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.498599 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-config-data" (OuterVolumeSpecName: "config-data") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.552141 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-server-conf" (OuterVolumeSpecName: "server-conf") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571751 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571790 4959 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8703d817-5027-4394-a52d-a895f7e0fd10-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571805 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzjr8\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-kube-api-access-vzjr8\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571818 4959 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-server-conf\") on node \"crc\" 
DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571851 4959 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571863 4959 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571874 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571886 4959 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8703d817-5027-4394-a52d-a895f7e0fd10-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571897 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.571912 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8703d817-5027-4394-a52d-a895f7e0fd10-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.588712 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8703d817-5027-4394-a52d-a895f7e0fd10" (UID: "8703d817-5027-4394-a52d-a895f7e0fd10"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.606261 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.609436 4959 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.672781 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-kube-api-access-r8xqs\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.672868 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-plugins-conf\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.672894 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-server-conf\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.672918 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-confd\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.672974 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-erlang-cookie-secret\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673008 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-tls\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673055 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673084 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-pod-info\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673103 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-config-data\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673142 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-erlang-cookie\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: 
\"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673264 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-plugins\") pod \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\" (UID: \"0a23cdde-db3b-403e-8c39-1ed3b6c6c808\") " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673385 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673705 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8703d817-5027-4394-a52d-a895f7e0fd10-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673739 4959 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.673748 4959 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.674123 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.679004 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-kube-api-access-r8xqs" (OuterVolumeSpecName: "kube-api-access-r8xqs") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "kube-api-access-r8xqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.679415 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.679746 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.682277 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.684217 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.684322 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-pod-info" (OuterVolumeSpecName: "pod-info") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.699220 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-config-data" (OuterVolumeSpecName: "config-data") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.730772 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-server-conf" (OuterVolumeSpecName: "server-conf") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775313 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775346 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-kube-api-access-r8xqs\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775356 4959 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775364 4959 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775373 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775402 4959 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775411 4959 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775420 4959 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.775429 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.778887 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0a23cdde-db3b-403e-8c39-1ed3b6c6c808" (UID: "0a23cdde-db3b-403e-8c39-1ed3b6c6c808"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.795274 4959 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.877590 4959 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.877645 4959 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a23cdde-db3b-403e-8c39-1ed3b6c6c808-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.907972 4959 generic.go:334] "Generic (PLEG): container finished" podID="8703d817-5027-4394-a52d-a895f7e0fd10" containerID="94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648" exitCode=0 Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.908006 4959 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8703d817-5027-4394-a52d-a895f7e0fd10","Type":"ContainerDied","Data":"94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648"} Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.908065 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.908071 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8703d817-5027-4394-a52d-a895f7e0fd10","Type":"ContainerDied","Data":"b9b277bd9b1c3beea429c40e125fcf1770c814f29e6c219aa891fbc2161905d5"} Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.908084 4959 scope.go:117] "RemoveContainer" containerID="94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.910701 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"27ace40315804865739527b95af409b39ad16c231be4b707c59ad02e2e723a6d"} Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.913462 4959 generic.go:334] "Generic (PLEG): container finished" podID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerID="5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554" exitCode=0 Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.913507 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.913542 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a23cdde-db3b-403e-8c39-1ed3b6c6c808","Type":"ContainerDied","Data":"5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554"} Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.913580 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a23cdde-db3b-403e-8c39-1ed3b6c6c808","Type":"ContainerDied","Data":"dbe8f26224cd69bad42fc282f078b43dd9ba8322e47e043164f2628649ff015e"} Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.939092 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.943110 4959 scope.go:117] "RemoveContainer" containerID="8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.957893 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.992890 4959 scope.go:117] "RemoveContainer" containerID="94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648" Oct 07 13:20:38 crc kubenswrapper[4959]: E1007 13:20:38.993295 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648\": container with ID starting with 94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648 not found: ID does not exist" containerID="94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.993327 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648"} err="failed to get container status \"94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648\": rpc error: code = NotFound desc = could not find container \"94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648\": container with ID starting with 94398584604fa2eb125ee077b98d5ff2a7e3068818a2f226ff3bdb87d4816648 not found: ID does not exist" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.993348 4959 scope.go:117] "RemoveContainer" containerID="8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd" Oct 07 13:20:38 crc kubenswrapper[4959]: E1007 13:20:38.994255 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd\": container with ID starting with 8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd not found: ID does not exist" containerID="8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.994310 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd"} err="failed to get container status \"8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd\": rpc error: code = NotFound desc = could not find container \"8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd\": container with ID starting with 8945d7a6d41134ad1c2c9684681500d261324136b9ae03d1b2e1829df8f840fd not found: ID does not exist" Oct 07 13:20:38 crc kubenswrapper[4959]: I1007 13:20:38.994342 4959 scope.go:117] "RemoveContainer" containerID="5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.008760 4959 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: E1007 13:20:39.009158 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="setup-container" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.009179 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="setup-container" Oct 07 13:20:39 crc kubenswrapper[4959]: E1007 13:20:39.009203 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="setup-container" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.009212 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="setup-container" Oct 07 13:20:39 crc kubenswrapper[4959]: E1007 13:20:39.009230 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="rabbitmq" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.009239 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="rabbitmq" Oct 07 13:20:39 crc kubenswrapper[4959]: E1007 13:20:39.009261 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="rabbitmq" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.009268 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="rabbitmq" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.009464 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" containerName="rabbitmq" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.009481 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" containerName="rabbitmq" Oct 07 
13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.010441 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.015469 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.015479 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.016168 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d9h7w" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.016284 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.016375 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.016304 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.016532 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.016961 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.020579 4959 scope.go:117] "RemoveContainer" containerID="fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.040318 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.040375 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 
13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.040387 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.042583 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.048020 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.048203 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.048223 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.048412 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.048495 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fc4nq" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.048707 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.052017 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.056973 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.070834 4959 scope.go:117] "RemoveContainer" containerID="5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554" Oct 07 13:20:39 crc kubenswrapper[4959]: E1007 13:20:39.076055 4959 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554\": container with ID starting with 5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554 not found: ID does not exist" containerID="5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.076195 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554"} err="failed to get container status \"5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554\": rpc error: code = NotFound desc = could not find container \"5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554\": container with ID starting with 5e3e36c88658fd1491acfb39161251ba9d3ef0d94f6c15c69746a594c5892554 not found: ID does not exist" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.076298 4959 scope.go:117] "RemoveContainer" containerID="fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080166 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n652s\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-kube-api-access-n652s\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080200 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080259 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080285 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-config-data\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080305 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080323 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080342 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52260e60-f3cc-46d0-b7ce-0424500d0573-pod-info\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080368 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-server-conf\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080388 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080415 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52260e60-f3cc-46d0-b7ce-0424500d0573-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.080435 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: E1007 13:20:39.083135 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6\": container with ID starting with fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6 not found: ID does not exist" containerID="fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.083188 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6"} err="failed to get container status \"fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6\": rpc error: code = NotFound desc = could not find container \"fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6\": container with ID starting with fb2aba07519c7cd7c29a95e3be7262b8714dce4426d48bc75c6b2c1299ad52f6 not found: ID does not exist" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.182909 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n652s\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-kube-api-access-n652s\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.182985 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183039 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183065 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc 
kubenswrapper[4959]: I1007 13:20:39.183109 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-config-data\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183151 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183171 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183196 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183220 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52260e60-f3cc-46d0-b7ce-0424500d0573-pod-info\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183261 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slcn7\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-kube-api-access-slcn7\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183280 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183301 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-server-conf\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183323 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183340 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183438 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183598 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183740 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.184246 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-config-data\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.184596 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.183507 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52260e60-f3cc-46d0-b7ce-0424500d0573-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.184837 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.184874 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.184900 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.184949 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 
13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.185127 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.185189 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.185516 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52260e60-f3cc-46d0-b7ce-0424500d0573-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.188842 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.188985 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.191438 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/52260e60-f3cc-46d0-b7ce-0424500d0573-pod-info\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.198080 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52260e60-f3cc-46d0-b7ce-0424500d0573-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.202608 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n652s\" (UniqueName: \"kubernetes.io/projected/52260e60-f3cc-46d0-b7ce-0424500d0573-kube-api-access-n652s\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.215222 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"52260e60-f3cc-46d0-b7ce-0424500d0573\") " pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.286775 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287134 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287165 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287213 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slcn7\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-kube-api-access-slcn7\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287237 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287265 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287294 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287327 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287365 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287385 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287415 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.287586 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.288151 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.288241 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.288377 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.288467 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.288522 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.291918 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.293064 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.297533 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.301196 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.303535 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slcn7\" (UniqueName: \"kubernetes.io/projected/bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1-kube-api-access-slcn7\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.319312 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.349918 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.403226 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.794250 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.873904 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:20:39 crc kubenswrapper[4959]: W1007 13:20:39.880685 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd08ab8c_e65e_4ca6_8cd3_a62bec086bb1.slice/crio-74d7cde223b481bc02ed247f928cbd92fed4586c5949c1df55ce7e0796554874 WatchSource:0}: Error finding container 74d7cde223b481bc02ed247f928cbd92fed4586c5949c1df55ce7e0796554874: Status 404 returned error can't find the container with id 74d7cde223b481bc02ed247f928cbd92fed4586c5949c1df55ce7e0796554874 Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.932967 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52260e60-f3cc-46d0-b7ce-0424500d0573","Type":"ContainerStarted","Data":"25e19c69cb0e2996a6955e059c955d0fbe0141ae6e82f99c6b7bf7e5ec5a4bef"} Oct 07 13:20:39 crc kubenswrapper[4959]: I1007 13:20:39.939578 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1","Type":"ContainerStarted","Data":"74d7cde223b481bc02ed247f928cbd92fed4586c5949c1df55ce7e0796554874"} Oct 07 13:20:40 crc kubenswrapper[4959]: I1007 13:20:40.820306 4959 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0a23cdde-db3b-403e-8c39-1ed3b6c6c808" path="/var/lib/kubelet/pods/0a23cdde-db3b-403e-8c39-1ed3b6c6c808/volumes" Oct 07 13:20:40 crc kubenswrapper[4959]: I1007 13:20:40.821890 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8703d817-5027-4394-a52d-a895f7e0fd10" path="/var/lib/kubelet/pods/8703d817-5027-4394-a52d-a895f7e0fd10/volumes" Oct 07 13:20:41 crc kubenswrapper[4959]: I1007 13:20:41.968049 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52260e60-f3cc-46d0-b7ce-0424500d0573","Type":"ContainerStarted","Data":"88202865536c5fb1b1e0c808d3bed657861aa8a258091176a830ec526a23d404"} Oct 07 13:20:41 crc kubenswrapper[4959]: I1007 13:20:41.970369 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1","Type":"ContainerStarted","Data":"4cb8d33260ac0edeb4e539ae15c7fc394e8874793fdc0ce2ef080edf57476b9a"} Oct 07 13:20:42 crc kubenswrapper[4959]: I1007 13:20:42.950652 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5745cbd8d7-zh2dj"] Oct 07 13:20:42 crc kubenswrapper[4959]: I1007 13:20:42.952352 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:42 crc kubenswrapper[4959]: I1007 13:20:42.954072 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 07 13:20:42 crc kubenswrapper[4959]: I1007 13:20:42.994383 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5745cbd8d7-zh2dj"] Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.071034 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.071121 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-dns-svc\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.071962 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-openstack-edpm-ipam\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.072426 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " 
pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.072925 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhwz6\" (UniqueName: \"kubernetes.io/projected/159e0fc4-06dd-463c-88a8-6398022da1bb-kube-api-access-vhwz6\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.073001 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-config\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.174542 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-dns-svc\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.174615 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-openstack-edpm-ipam\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.174691 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " 
pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.174756 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhwz6\" (UniqueName: \"kubernetes.io/projected/159e0fc4-06dd-463c-88a8-6398022da1bb-kube-api-access-vhwz6\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.174778 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-config\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.174824 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.175606 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-dns-svc\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.175814 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: 
I1007 13:20:43.175903 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-config\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.176130 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.176267 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-openstack-edpm-ipam\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.198387 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhwz6\" (UniqueName: \"kubernetes.io/projected/159e0fc4-06dd-463c-88a8-6398022da1bb-kube-api-access-vhwz6\") pod \"dnsmasq-dns-5745cbd8d7-zh2dj\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.273484 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.701174 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5745cbd8d7-zh2dj"] Oct 07 13:20:43 crc kubenswrapper[4959]: I1007 13:20:43.988194 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" event={"ID":"159e0fc4-06dd-463c-88a8-6398022da1bb","Type":"ContainerStarted","Data":"f1aac4122d9ce1d18d45d98abf97217e5091abb4b4c0e852eb854f19a56fc857"} Oct 07 13:20:44 crc kubenswrapper[4959]: I1007 13:20:44.996663 4959 generic.go:334] "Generic (PLEG): container finished" podID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerID="2af5dc2e6796b2c22a3e1047d82f983163cc18ea65a5e8b6f5b5def65cfb166d" exitCode=0 Oct 07 13:20:44 crc kubenswrapper[4959]: I1007 13:20:44.996734 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" event={"ID":"159e0fc4-06dd-463c-88a8-6398022da1bb","Type":"ContainerDied","Data":"2af5dc2e6796b2c22a3e1047d82f983163cc18ea65a5e8b6f5b5def65cfb166d"} Oct 07 13:20:46 crc kubenswrapper[4959]: I1007 13:20:46.006034 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" event={"ID":"159e0fc4-06dd-463c-88a8-6398022da1bb","Type":"ContainerStarted","Data":"6bec252f01860587dc575a78079021832fdffd18f1f4587492de4e07d7916b7f"} Oct 07 13:20:46 crc kubenswrapper[4959]: I1007 13:20:46.007176 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:46 crc kubenswrapper[4959]: I1007 13:20:46.030239 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" podStartSLOduration=4.030220353 podStartE2EDuration="4.030220353s" podCreationTimestamp="2025-10-07 13:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:20:46.021035747 +0000 UTC m=+1198.181758434" watchObservedRunningTime="2025-10-07 13:20:46.030220353 +0000 UTC m=+1198.190943030" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.275829 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.320460 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869677f947-z6bhx"] Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.320708 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869677f947-z6bhx" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerName="dnsmasq-dns" containerID="cri-o://26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553" gracePeriod=10 Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.496906 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f5d87575-rj7fd"] Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.498774 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.524983 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f5d87575-rj7fd"] Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.567379 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.567835 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-dns-svc\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.567860 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.567881 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.567905 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-config\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.567978 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvngv\" (UniqueName: \"kubernetes.io/projected/849a94c5-f31f-4068-b47c-fb1163b6afc0-kube-api-access-wvngv\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.669970 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-dns-svc\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.670032 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.670055 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.670074 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-config\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.670145 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvngv\" (UniqueName: \"kubernetes.io/projected/849a94c5-f31f-4068-b47c-fb1163b6afc0-kube-api-access-wvngv\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.670177 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.671088 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.671407 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.671836 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-dns-svc\") pod 
\"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.672074 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.672306 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-config\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.714040 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvngv\" (UniqueName: \"kubernetes.io/projected/849a94c5-f31f-4068-b47c-fb1163b6afc0-kube-api-access-wvngv\") pod \"dnsmasq-dns-5f5d87575-rj7fd\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.814899 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.946460 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.980370 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-nb\") pod \"25aebe9e-8937-40d0-bb85-e057e6b79778\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.980434 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-sb\") pod \"25aebe9e-8937-40d0-bb85-e057e6b79778\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.980511 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-dns-svc\") pod \"25aebe9e-8937-40d0-bb85-e057e6b79778\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.980568 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6hwx\" (UniqueName: \"kubernetes.io/projected/25aebe9e-8937-40d0-bb85-e057e6b79778-kube-api-access-z6hwx\") pod \"25aebe9e-8937-40d0-bb85-e057e6b79778\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " Oct 07 13:20:53 crc kubenswrapper[4959]: I1007 13:20:53.980727 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-config\") pod \"25aebe9e-8937-40d0-bb85-e057e6b79778\" (UID: \"25aebe9e-8937-40d0-bb85-e057e6b79778\") " Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.003346 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25aebe9e-8937-40d0-bb85-e057e6b79778-kube-api-access-z6hwx" (OuterVolumeSpecName: "kube-api-access-z6hwx") pod "25aebe9e-8937-40d0-bb85-e057e6b79778" (UID: "25aebe9e-8937-40d0-bb85-e057e6b79778"). InnerVolumeSpecName "kube-api-access-z6hwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.044870 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25aebe9e-8937-40d0-bb85-e057e6b79778" (UID: "25aebe9e-8937-40d0-bb85-e057e6b79778"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.059139 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-config" (OuterVolumeSpecName: "config") pod "25aebe9e-8937-40d0-bb85-e057e6b79778" (UID: "25aebe9e-8937-40d0-bb85-e057e6b79778"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.064894 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25aebe9e-8937-40d0-bb85-e057e6b79778" (UID: "25aebe9e-8937-40d0-bb85-e057e6b79778"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.070497 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25aebe9e-8937-40d0-bb85-e057e6b79778" (UID: "25aebe9e-8937-40d0-bb85-e057e6b79778"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.073655 4959 generic.go:334] "Generic (PLEG): container finished" podID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerID="26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553" exitCode=0 Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.073709 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869677f947-z6bhx" event={"ID":"25aebe9e-8937-40d0-bb85-e057e6b79778","Type":"ContainerDied","Data":"26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553"} Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.073742 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869677f947-z6bhx" event={"ID":"25aebe9e-8937-40d0-bb85-e057e6b79778","Type":"ContainerDied","Data":"b562473c1dbec9c2a8cf9073e41a7ff2d61be84b2a4a98c67505bc3734777197"} Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.073737 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869677f947-z6bhx" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.073758 4959 scope.go:117] "RemoveContainer" containerID="26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.082344 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.082378 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6hwx\" (UniqueName: \"kubernetes.io/projected/25aebe9e-8937-40d0-bb85-e057e6b79778-kube-api-access-z6hwx\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.082389 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.082399 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.082412 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25aebe9e-8937-40d0-bb85-e057e6b79778-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.098974 4959 scope.go:117] "RemoveContainer" containerID="97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.122834 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869677f947-z6bhx"] Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.130160 4959 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869677f947-z6bhx"] Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.132003 4959 scope.go:117] "RemoveContainer" containerID="26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553" Oct 07 13:20:54 crc kubenswrapper[4959]: E1007 13:20:54.132457 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553\": container with ID starting with 26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553 not found: ID does not exist" containerID="26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.132513 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553"} err="failed to get container status \"26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553\": rpc error: code = NotFound desc = could not find container \"26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553\": container with ID starting with 26bf4cf29623f55f42024e2005195d2f88a7dc0d25e96e9b982d12de5ef94553 not found: ID does not exist" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.132547 4959 scope.go:117] "RemoveContainer" containerID="97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf" Oct 07 13:20:54 crc kubenswrapper[4959]: E1007 13:20:54.133700 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf\": container with ID starting with 97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf not found: ID does not exist" containerID="97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf" Oct 07 13:20:54 crc 
kubenswrapper[4959]: I1007 13:20:54.133731 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf"} err="failed to get container status \"97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf\": rpc error: code = NotFound desc = could not find container \"97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf\": container with ID starting with 97bc9060dc53eb2d98cee4da4c1563beda0874ef8e4a3c16612427eaa6003cbf not found: ID does not exist" Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.265575 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f5d87575-rj7fd"] Oct 07 13:20:54 crc kubenswrapper[4959]: W1007 13:20:54.271359 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod849a94c5_f31f_4068_b47c_fb1163b6afc0.slice/crio-5468eec3d989422174f56a4a6b96cf2daabe7eab56b4c67367ce6c4bd49d1364 WatchSource:0}: Error finding container 5468eec3d989422174f56a4a6b96cf2daabe7eab56b4c67367ce6c4bd49d1364: Status 404 returned error can't find the container with id 5468eec3d989422174f56a4a6b96cf2daabe7eab56b4c67367ce6c4bd49d1364 Oct 07 13:20:54 crc kubenswrapper[4959]: I1007 13:20:54.823008 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" path="/var/lib/kubelet/pods/25aebe9e-8937-40d0-bb85-e057e6b79778/volumes" Oct 07 13:20:55 crc kubenswrapper[4959]: I1007 13:20:55.081772 4959 generic.go:334] "Generic (PLEG): container finished" podID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerID="a3302a832e96814f3ca61c24a777c45b99e45237ec86c4511054e9d06b5ada4c" exitCode=0 Oct 07 13:20:55 crc kubenswrapper[4959]: I1007 13:20:55.081856 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" 
event={"ID":"849a94c5-f31f-4068-b47c-fb1163b6afc0","Type":"ContainerDied","Data":"a3302a832e96814f3ca61c24a777c45b99e45237ec86c4511054e9d06b5ada4c"} Oct 07 13:20:55 crc kubenswrapper[4959]: I1007 13:20:55.081880 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" event={"ID":"849a94c5-f31f-4068-b47c-fb1163b6afc0","Type":"ContainerStarted","Data":"5468eec3d989422174f56a4a6b96cf2daabe7eab56b4c67367ce6c4bd49d1364"} Oct 07 13:20:56 crc kubenswrapper[4959]: I1007 13:20:56.092735 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" event={"ID":"849a94c5-f31f-4068-b47c-fb1163b6afc0","Type":"ContainerStarted","Data":"bdfdfcce1c2bf9bcbf7c1fdeafa9b7a419064d1e2802f8324574900e53af9f42"} Oct 07 13:20:56 crc kubenswrapper[4959]: I1007 13:20:56.093049 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:20:56 crc kubenswrapper[4959]: I1007 13:20:56.121259 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" podStartSLOduration=3.121240862 podStartE2EDuration="3.121240862s" podCreationTimestamp="2025-10-07 13:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:20:56.111310083 +0000 UTC m=+1208.272032760" watchObservedRunningTime="2025-10-07 13:20:56.121240862 +0000 UTC m=+1208.281963539" Oct 07 13:21:03 crc kubenswrapper[4959]: I1007 13:21:03.816670 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:21:03 crc kubenswrapper[4959]: I1007 13:21:03.877073 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5745cbd8d7-zh2dj"] Oct 07 13:21:03 crc kubenswrapper[4959]: I1007 13:21:03.877331 4959 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerName="dnsmasq-dns" containerID="cri-o://6bec252f01860587dc575a78079021832fdffd18f1f4587492de4e07d7916b7f" gracePeriod=10 Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.156662 4959 generic.go:334] "Generic (PLEG): container finished" podID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerID="6bec252f01860587dc575a78079021832fdffd18f1f4587492de4e07d7916b7f" exitCode=0 Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.156734 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" event={"ID":"159e0fc4-06dd-463c-88a8-6398022da1bb","Type":"ContainerDied","Data":"6bec252f01860587dc575a78079021832fdffd18f1f4587492de4e07d7916b7f"} Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.371253 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.476893 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-dns-svc\") pod \"159e0fc4-06dd-463c-88a8-6398022da1bb\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.476983 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-nb\") pod \"159e0fc4-06dd-463c-88a8-6398022da1bb\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.477071 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-sb\") pod \"159e0fc4-06dd-463c-88a8-6398022da1bb\" (UID: 
\"159e0fc4-06dd-463c-88a8-6398022da1bb\") " Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.477108 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-config\") pod \"159e0fc4-06dd-463c-88a8-6398022da1bb\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.477157 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-openstack-edpm-ipam\") pod \"159e0fc4-06dd-463c-88a8-6398022da1bb\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.477196 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhwz6\" (UniqueName: \"kubernetes.io/projected/159e0fc4-06dd-463c-88a8-6398022da1bb-kube-api-access-vhwz6\") pod \"159e0fc4-06dd-463c-88a8-6398022da1bb\" (UID: \"159e0fc4-06dd-463c-88a8-6398022da1bb\") " Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.485250 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159e0fc4-06dd-463c-88a8-6398022da1bb-kube-api-access-vhwz6" (OuterVolumeSpecName: "kube-api-access-vhwz6") pod "159e0fc4-06dd-463c-88a8-6398022da1bb" (UID: "159e0fc4-06dd-463c-88a8-6398022da1bb"). InnerVolumeSpecName "kube-api-access-vhwz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.559996 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "159e0fc4-06dd-463c-88a8-6398022da1bb" (UID: "159e0fc4-06dd-463c-88a8-6398022da1bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.579215 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.579252 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhwz6\" (UniqueName: \"kubernetes.io/projected/159e0fc4-06dd-463c-88a8-6398022da1bb-kube-api-access-vhwz6\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.589133 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "159e0fc4-06dd-463c-88a8-6398022da1bb" (UID: "159e0fc4-06dd-463c-88a8-6398022da1bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.592212 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "159e0fc4-06dd-463c-88a8-6398022da1bb" (UID: "159e0fc4-06dd-463c-88a8-6398022da1bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.601113 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "159e0fc4-06dd-463c-88a8-6398022da1bb" (UID: "159e0fc4-06dd-463c-88a8-6398022da1bb"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.606089 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-config" (OuterVolumeSpecName: "config") pod "159e0fc4-06dd-463c-88a8-6398022da1bb" (UID: "159e0fc4-06dd-463c-88a8-6398022da1bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.680003 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.680038 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.680050 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:04 crc kubenswrapper[4959]: I1007 13:21:04.680059 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/159e0fc4-06dd-463c-88a8-6398022da1bb-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:05 crc kubenswrapper[4959]: I1007 13:21:05.165218 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" event={"ID":"159e0fc4-06dd-463c-88a8-6398022da1bb","Type":"ContainerDied","Data":"f1aac4122d9ce1d18d45d98abf97217e5091abb4b4c0e852eb854f19a56fc857"} Oct 07 13:21:05 crc kubenswrapper[4959]: I1007 13:21:05.165257 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5745cbd8d7-zh2dj" Oct 07 13:21:05 crc kubenswrapper[4959]: I1007 13:21:05.165530 4959 scope.go:117] "RemoveContainer" containerID="6bec252f01860587dc575a78079021832fdffd18f1f4587492de4e07d7916b7f" Oct 07 13:21:05 crc kubenswrapper[4959]: I1007 13:21:05.187305 4959 scope.go:117] "RemoveContainer" containerID="2af5dc2e6796b2c22a3e1047d82f983163cc18ea65a5e8b6f5b5def65cfb166d" Oct 07 13:21:05 crc kubenswrapper[4959]: I1007 13:21:05.191521 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5745cbd8d7-zh2dj"] Oct 07 13:21:05 crc kubenswrapper[4959]: I1007 13:21:05.198835 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5745cbd8d7-zh2dj"] Oct 07 13:21:06 crc kubenswrapper[4959]: I1007 13:21:06.825576 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" path="/var/lib/kubelet/pods/159e0fc4-06dd-463c-88a8-6398022da1bb/volumes" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.136444 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"] Oct 07 13:21:14 crc kubenswrapper[4959]: E1007 13:21:14.142880 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerName="dnsmasq-dns" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.142929 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerName="dnsmasq-dns" Oct 07 13:21:14 crc kubenswrapper[4959]: E1007 13:21:14.142961 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerName="dnsmasq-dns" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.142975 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerName="dnsmasq-dns" Oct 07 13:21:14 crc 
kubenswrapper[4959]: E1007 13:21:14.142990 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerName="init" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.143001 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerName="init" Oct 07 13:21:14 crc kubenswrapper[4959]: E1007 13:21:14.143033 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerName="init" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.143043 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerName="init" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.143513 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="159e0fc4-06dd-463c-88a8-6398022da1bb" containerName="dnsmasq-dns" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.143538 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="25aebe9e-8937-40d0-bb85-e057e6b79778" containerName="dnsmasq-dns" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.144587 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.147356 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.147756 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.149679 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.149703 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.163205 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"] Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.252275 4959 generic.go:334] "Generic (PLEG): container finished" podID="52260e60-f3cc-46d0-b7ce-0424500d0573" containerID="88202865536c5fb1b1e0c808d3bed657861aa8a258091176a830ec526a23d404" exitCode=0 Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.252331 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52260e60-f3cc-46d0-b7ce-0424500d0573","Type":"ContainerDied","Data":"88202865536c5fb1b1e0c808d3bed657861aa8a258091176a830ec526a23d404"} Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.258060 4959 generic.go:334] "Generic (PLEG): container finished" podID="bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1" containerID="4cb8d33260ac0edeb4e539ae15c7fc394e8874793fdc0ce2ef080edf57476b9a" exitCode=0 Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.258120 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1","Type":"ContainerDied","Data":"4cb8d33260ac0edeb4e539ae15c7fc394e8874793fdc0ce2ef080edf57476b9a"} Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.338721 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc5jh\" (UniqueName: \"kubernetes.io/projected/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-kube-api-access-sc5jh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.338783 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.338830 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.338870 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 
crc kubenswrapper[4959]: I1007 13:21:14.440736 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc5jh\" (UniqueName: \"kubernetes.io/projected/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-kube-api-access-sc5jh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.440951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.441093 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.441196 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.447343 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.447687 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.450711 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.460948 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc5jh\" (UniqueName: \"kubernetes.io/projected/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-kube-api-access-sc5jh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" Oct 07 13:21:14 crc kubenswrapper[4959]: I1007 13:21:14.470392 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.048234 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"]
Oct 07 13:21:15 crc kubenswrapper[4959]: W1007 13:21:15.054416 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e8bfe37_7a34_43c8_9a53_d0a03f45b382.slice/crio-283d0be61e476b034d451e9ea18d19c815e24d18c9c6134f37057f5095559497 WatchSource:0}: Error finding container 283d0be61e476b034d451e9ea18d19c815e24d18c9c6134f37057f5095559497: Status 404 returned error can't find the container with id 283d0be61e476b034d451e9ea18d19c815e24d18c9c6134f37057f5095559497
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.056943 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.268706 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" event={"ID":"6e8bfe37-7a34-43c8-9a53-d0a03f45b382","Type":"ContainerStarted","Data":"283d0be61e476b034d451e9ea18d19c815e24d18c9c6134f37057f5095559497"}
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.270885 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52260e60-f3cc-46d0-b7ce-0424500d0573","Type":"ContainerStarted","Data":"61a2d86101e1c24d66189bb9484c88b364c0e83318349c5696162a93f9617294"}
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.271997 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.274497 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1","Type":"ContainerStarted","Data":"cf16720576b6e8db5268911dde5e04c0135770d394353bc1aecc20a4e7b68435"}
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.274759 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.305163 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.305132819 podStartE2EDuration="37.305132819s" podCreationTimestamp="2025-10-07 13:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:21:15.297209319 +0000 UTC m=+1227.457932036" watchObservedRunningTime="2025-10-07 13:21:15.305132819 +0000 UTC m=+1227.465855496"
Oct 07 13:21:15 crc kubenswrapper[4959]: I1007 13:21:15.326364 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.326343625 podStartE2EDuration="37.326343625s" podCreationTimestamp="2025-10-07 13:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:21:15.322485973 +0000 UTC m=+1227.483208670" watchObservedRunningTime="2025-10-07 13:21:15.326343625 +0000 UTC m=+1227.487066312"
Oct 07 13:21:24 crc kubenswrapper[4959]: I1007 13:21:24.354729 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" event={"ID":"6e8bfe37-7a34-43c8-9a53-d0a03f45b382","Type":"ContainerStarted","Data":"9793afd8210076f79a6f5af7970531b47b268d36e0fa1cd2852fb0dbe205a26f"}
Oct 07 13:21:24 crc kubenswrapper[4959]: I1007 13:21:24.377055 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" podStartSLOduration=2.280224467 podStartE2EDuration="10.37703681s" podCreationTimestamp="2025-10-07 13:21:14 +0000 UTC" firstStartedPulling="2025-10-07 13:21:15.056610719 +0000 UTC m=+1227.217333396" lastFinishedPulling="2025-10-07 13:21:23.153423062 +0000 UTC m=+1235.314145739" observedRunningTime="2025-10-07 13:21:24.373716543 +0000 UTC m=+1236.534439240" watchObservedRunningTime="2025-10-07 13:21:24.37703681 +0000 UTC m=+1236.537759487"
Oct 07 13:21:29 crc kubenswrapper[4959]: I1007 13:21:29.353029 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 07 13:21:29 crc kubenswrapper[4959]: I1007 13:21:29.407068 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 07 13:21:35 crc kubenswrapper[4959]: I1007 13:21:35.467091 4959 generic.go:334] "Generic (PLEG): container finished" podID="6e8bfe37-7a34-43c8-9a53-d0a03f45b382" containerID="9793afd8210076f79a6f5af7970531b47b268d36e0fa1cd2852fb0dbe205a26f" exitCode=0
Oct 07 13:21:35 crc kubenswrapper[4959]: I1007 13:21:35.467184 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" event={"ID":"6e8bfe37-7a34-43c8-9a53-d0a03f45b382","Type":"ContainerDied","Data":"9793afd8210076f79a6f5af7970531b47b268d36e0fa1cd2852fb0dbe205a26f"}
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.847007 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.981899 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-ssh-key\") pod \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") "
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.981968 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-inventory\") pod \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") "
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.981995 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-repo-setup-combined-ca-bundle\") pod \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") "
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.982033 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc5jh\" (UniqueName: \"kubernetes.io/projected/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-kube-api-access-sc5jh\") pod \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\" (UID: \"6e8bfe37-7a34-43c8-9a53-d0a03f45b382\") "
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.986739 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6e8bfe37-7a34-43c8-9a53-d0a03f45b382" (UID: "6e8bfe37-7a34-43c8-9a53-d0a03f45b382"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:21:36 crc kubenswrapper[4959]: I1007 13:21:36.987396 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-kube-api-access-sc5jh" (OuterVolumeSpecName: "kube-api-access-sc5jh") pod "6e8bfe37-7a34-43c8-9a53-d0a03f45b382" (UID: "6e8bfe37-7a34-43c8-9a53-d0a03f45b382"). InnerVolumeSpecName "kube-api-access-sc5jh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.007132 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6e8bfe37-7a34-43c8-9a53-d0a03f45b382" (UID: "6e8bfe37-7a34-43c8-9a53-d0a03f45b382"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.011375 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-inventory" (OuterVolumeSpecName: "inventory") pod "6e8bfe37-7a34-43c8-9a53-d0a03f45b382" (UID: "6e8bfe37-7a34-43c8-9a53-d0a03f45b382"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.084654 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.084694 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.084715 4959 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.084726 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc5jh\" (UniqueName: \"kubernetes.io/projected/6e8bfe37-7a34-43c8-9a53-d0a03f45b382-kube-api-access-sc5jh\") on node \"crc\" DevicePath \"\""
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.498148 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g" event={"ID":"6e8bfe37-7a34-43c8-9a53-d0a03f45b382","Type":"ContainerDied","Data":"283d0be61e476b034d451e9ea18d19c815e24d18c9c6134f37057f5095559497"}
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.498190 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="283d0be61e476b034d451e9ea18d19c815e24d18c9c6134f37057f5095559497"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.498288 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.559278 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"]
Oct 07 13:21:37 crc kubenswrapper[4959]: E1007 13:21:37.559966 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8bfe37-7a34-43c8-9a53-d0a03f45b382" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.560074 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8bfe37-7a34-43c8-9a53-d0a03f45b382" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.560327 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8bfe37-7a34-43c8-9a53-d0a03f45b382" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.560958 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.563161 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.563311 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.563678 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.564039 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.588356 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"]
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.696502 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkkwj\" (UniqueName: \"kubernetes.io/projected/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-kube-api-access-vkkwj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.696644 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.696817 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.696874 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.798258 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.798372 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.798466 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.798693 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkkwj\" (UniqueName: \"kubernetes.io/projected/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-kube-api-access-vkkwj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.802776 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.806184 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.809012 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.814971 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkkwj\" (UniqueName: \"kubernetes.io/projected/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-kube-api-access-vkkwj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x94st\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:37 crc kubenswrapper[4959]: I1007 13:21:37.884149 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:21:38 crc kubenswrapper[4959]: I1007 13:21:38.397387 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"]
Oct 07 13:21:38 crc kubenswrapper[4959]: W1007 13:21:38.399046 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5518b9cd_5bcc_480a_95ef_6aa41f1b6745.slice/crio-7c6ff740a9b69885396218aa0863f3d515f59cbe2dd1f6cec31e5a2af259349d WatchSource:0}: Error finding container 7c6ff740a9b69885396218aa0863f3d515f59cbe2dd1f6cec31e5a2af259349d: Status 404 returned error can't find the container with id 7c6ff740a9b69885396218aa0863f3d515f59cbe2dd1f6cec31e5a2af259349d
Oct 07 13:21:38 crc kubenswrapper[4959]: I1007 13:21:38.510105 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st" event={"ID":"5518b9cd-5bcc-480a-95ef-6aa41f1b6745","Type":"ContainerStarted","Data":"7c6ff740a9b69885396218aa0863f3d515f59cbe2dd1f6cec31e5a2af259349d"}
Oct 07 13:21:39 crc kubenswrapper[4959]: I1007 13:21:39.523804 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st" event={"ID":"5518b9cd-5bcc-480a-95ef-6aa41f1b6745","Type":"ContainerStarted","Data":"318bf4cd9f07d946f6ecdbd31b67b0cd55aed3b1df3b2f5ad6e35b6b1b1ba10f"}
Oct 07 13:21:39 crc kubenswrapper[4959]: I1007 13:21:39.551675 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st" podStartSLOduration=1.90276638 podStartE2EDuration="2.55161649s" podCreationTimestamp="2025-10-07 13:21:37 +0000 UTC" firstStartedPulling="2025-10-07 13:21:38.401801336 +0000 UTC m=+1250.562524023" lastFinishedPulling="2025-10-07 13:21:39.050651456 +0000 UTC m=+1251.211374133" observedRunningTime="2025-10-07 13:21:39.542576587 +0000 UTC m=+1251.703299274" watchObservedRunningTime="2025-10-07 13:21:39.55161649 +0000 UTC m=+1251.712339177"
Oct 07 13:22:51 crc kubenswrapper[4959]: I1007 13:22:51.206296 4959 scope.go:117] "RemoveContainer" containerID="fa70e3ec1034497a65f3a67ffe75ab8b403142fedb0964f38df8563919af4e44"
Oct 07 13:22:51 crc kubenswrapper[4959]: I1007 13:22:51.241683 4959 scope.go:117] "RemoveContainer" containerID="bd66846d6a2fe57e5218379116787a21a12ddf9bbbc1d6cce6e5ee567e72e0af"
Oct 07 13:22:51 crc kubenswrapper[4959]: I1007 13:22:51.310020 4959 scope.go:117] "RemoveContainer" containerID="7e3b10863cc41b61e8cca163942242905c547c11f4aa3798b7caf5f91aabb710"
Oct 07 13:23:07 crc kubenswrapper[4959]: I1007 13:23:07.695724 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:23:07 crc kubenswrapper[4959]: I1007 13:23:07.696184 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:23:37 crc kubenswrapper[4959]: I1007 13:23:37.696166 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:23:37 crc kubenswrapper[4959]: I1007 13:23:37.696651 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:23:51 crc kubenswrapper[4959]: I1007 13:23:51.378202 4959 scope.go:117] "RemoveContainer" containerID="e52db1c46fa22ff6d19858d91ef640ccba2e638180f1a0ed16359462adf62033"
Oct 07 13:23:51 crc kubenswrapper[4959]: I1007 13:23:51.408853 4959 scope.go:117] "RemoveContainer" containerID="9261500809d59f98a3afbef16a81008c4c95464b1784353631189e0e1e7dda9d"
Oct 07 13:23:51 crc kubenswrapper[4959]: I1007 13:23:51.462103 4959 scope.go:117] "RemoveContainer" containerID="33b11c80ee43caa19ebb078fa9d5df7b1c49895db7d508051650adc20a1e0692"
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.695528 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.696103 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.696148 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.696861 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27ace40315804865739527b95af409b39ad16c231be4b707c59ad02e2e723a6d"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.696918 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://27ace40315804865739527b95af409b39ad16c231be4b707c59ad02e2e723a6d" gracePeriod=600
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.924443 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="27ace40315804865739527b95af409b39ad16c231be4b707c59ad02e2e723a6d" exitCode=0
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.924537 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"27ace40315804865739527b95af409b39ad16c231be4b707c59ad02e2e723a6d"}
Oct 07 13:24:07 crc kubenswrapper[4959]: I1007 13:24:07.925053 4959 scope.go:117] "RemoveContainer" containerID="c6df8724a1b950c3c36c1fa6f27f6e7a3f2c184c3d4c9478bac7b1998fa538dc"
Oct 07 13:24:08 crc kubenswrapper[4959]: I1007 13:24:08.940055 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a"}
Oct 07 13:24:40 crc kubenswrapper[4959]: I1007 13:24:40.893018 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wz6xj"]
Oct 07 13:24:40 crc kubenswrapper[4959]: I1007 13:24:40.898012 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:40 crc kubenswrapper[4959]: I1007 13:24:40.906176 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wz6xj"]
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.070592 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-catalog-content\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.070672 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rgg9\" (UniqueName: \"kubernetes.io/projected/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-kube-api-access-2rgg9\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.071107 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-utilities\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.172724 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-utilities\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.172839 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-catalog-content\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.173344 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-utilities\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.173428 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rgg9\" (UniqueName: \"kubernetes.io/projected/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-kube-api-access-2rgg9\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.173544 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-catalog-content\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.197807 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rgg9\" (UniqueName: \"kubernetes.io/projected/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-kube-api-access-2rgg9\") pod \"certified-operators-wz6xj\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.219355 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:41 crc kubenswrapper[4959]: I1007 13:24:41.744882 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wz6xj"]
Oct 07 13:24:41 crc kubenswrapper[4959]: W1007 13:24:41.757211 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61c1e97_9fcc_4116_aeb6_70ee63d38c3a.slice/crio-5ffea2ad6b16bf8cf358482a232a4fc534a5b766a8a53c6b861803bf7e81495f WatchSource:0}: Error finding container 5ffea2ad6b16bf8cf358482a232a4fc534a5b766a8a53c6b861803bf7e81495f: Status 404 returned error can't find the container with id 5ffea2ad6b16bf8cf358482a232a4fc534a5b766a8a53c6b861803bf7e81495f
Oct 07 13:24:42 crc kubenswrapper[4959]: I1007 13:24:42.230012 4959 generic.go:334] "Generic (PLEG): container finished" podID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerID="1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d" exitCode=0
Oct 07 13:24:42 crc kubenswrapper[4959]: I1007 13:24:42.230078 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerDied","Data":"1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d"}
Oct 07 13:24:42 crc kubenswrapper[4959]: I1007 13:24:42.230493 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerStarted","Data":"5ffea2ad6b16bf8cf358482a232a4fc534a5b766a8a53c6b861803bf7e81495f"}
Oct 07 13:24:43 crc kubenswrapper[4959]: I1007 13:24:43.256223 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerStarted","Data":"8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb"}
Oct 07 13:24:44 crc kubenswrapper[4959]: I1007 13:24:44.267362 4959 generic.go:334] "Generic (PLEG): container finished" podID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerID="8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb" exitCode=0
Oct 07 13:24:44 crc kubenswrapper[4959]: I1007 13:24:44.267413 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerDied","Data":"8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb"}
Oct 07 13:24:47 crc kubenswrapper[4959]: I1007 13:24:47.292552 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerStarted","Data":"d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b"}
Oct 07 13:24:47 crc kubenswrapper[4959]: I1007 13:24:47.316944 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wz6xj" podStartSLOduration=2.7163271460000002 podStartE2EDuration="7.316920465s" podCreationTimestamp="2025-10-07 13:24:40 +0000 UTC" firstStartedPulling="2025-10-07 13:24:42.233602431 +0000 UTC m=+1434.394325108" lastFinishedPulling="2025-10-07 13:24:46.83419576 +0000 UTC m=+1438.994918427" observedRunningTime="2025-10-07 13:24:47.307748308 +0000 UTC m=+1439.468471005" watchObservedRunningTime="2025-10-07 13:24:47.316920465 +0000 UTC m=+1439.477643142"
Oct 07 13:24:51 crc kubenswrapper[4959]: I1007 13:24:51.220099 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:51 crc kubenswrapper[4959]: I1007 13:24:51.220584 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:51 crc kubenswrapper[4959]: I1007 13:24:51.261574 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wz6xj"
Oct 07 13:24:52 crc kubenswrapper[4959]: I1007 13:24:52.334026 4959 generic.go:334] "Generic (PLEG): container finished" podID="5518b9cd-5bcc-480a-95ef-6aa41f1b6745" containerID="318bf4cd9f07d946f6ecdbd31b67b0cd55aed3b1df3b2f5ad6e35b6b1b1ba10f" exitCode=0
Oct 07 13:24:52 crc kubenswrapper[4959]: I1007 13:24:52.334090 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st" event={"ID":"5518b9cd-5bcc-480a-95ef-6aa41f1b6745","Type":"ContainerDied","Data":"318bf4cd9f07d946f6ecdbd31b67b0cd55aed3b1df3b2f5ad6e35b6b1b1ba10f"}
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.697158 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.893924 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-ssh-key\") pod \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") "
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.894284 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkkwj\" (UniqueName: \"kubernetes.io/projected/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-kube-api-access-vkkwj\") pod \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") "
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.894449 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-bootstrap-combined-ca-bundle\") pod \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") "
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.894522 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-inventory\") pod \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\" (UID: \"5518b9cd-5bcc-480a-95ef-6aa41f1b6745\") "
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.899426 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-kube-api-access-vkkwj" (OuterVolumeSpecName: "kube-api-access-vkkwj") pod "5518b9cd-5bcc-480a-95ef-6aa41f1b6745" (UID: "5518b9cd-5bcc-480a-95ef-6aa41f1b6745"). InnerVolumeSpecName "kube-api-access-vkkwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.900380 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5518b9cd-5bcc-480a-95ef-6aa41f1b6745" (UID: "5518b9cd-5bcc-480a-95ef-6aa41f1b6745"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.918722 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5518b9cd-5bcc-480a-95ef-6aa41f1b6745" (UID: "5518b9cd-5bcc-480a-95ef-6aa41f1b6745"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.919098 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-inventory" (OuterVolumeSpecName: "inventory") pod "5518b9cd-5bcc-480a-95ef-6aa41f1b6745" (UID: "5518b9cd-5bcc-480a-95ef-6aa41f1b6745"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.996905 4959 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.996936 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.996945 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:24:53 crc kubenswrapper[4959]: I1007 13:24:53.996953 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkkwj\" (UniqueName: \"kubernetes.io/projected/5518b9cd-5bcc-480a-95ef-6aa41f1b6745-kube-api-access-vkkwj\") on node \"crc\" DevicePath \"\""
Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.351664 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st" event={"ID":"5518b9cd-5bcc-480a-95ef-6aa41f1b6745","Type":"ContainerDied","Data":"7c6ff740a9b69885396218aa0863f3d515f59cbe2dd1f6cec31e5a2af259349d"}
Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.351698 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6ff740a9b69885396218aa0863f3d515f59cbe2dd1f6cec31e5a2af259349d"
Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.351786 4959 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.418061 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"] Oct 07 13:24:54 crc kubenswrapper[4959]: E1007 13:24:54.418546 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5518b9cd-5bcc-480a-95ef-6aa41f1b6745" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.418572 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5518b9cd-5bcc-480a-95ef-6aa41f1b6745" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.418842 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5518b9cd-5bcc-480a-95ef-6aa41f1b6745" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.419547 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.421759 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.421951 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.422144 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.422487 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.427819 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"] Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.607387 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.607460 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.607487 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqq5p\" (UniqueName: \"kubernetes.io/projected/36002010-873a-43db-bc92-b37c7eb7bf35-kube-api-access-mqq5p\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.709955 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.710016 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.710040 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqq5p\" (UniqueName: \"kubernetes.io/projected/36002010-873a-43db-bc92-b37c7eb7bf35-kube-api-access-mqq5p\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.716978 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.716988 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.730798 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqq5p\" (UniqueName: \"kubernetes.io/projected/36002010-873a-43db-bc92-b37c7eb7bf35-kube-api-access-mqq5p\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-v85f8\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:54 crc kubenswrapper[4959]: I1007 13:24:54.737097 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" Oct 07 13:24:55 crc kubenswrapper[4959]: I1007 13:24:55.234694 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"] Oct 07 13:24:55 crc kubenswrapper[4959]: I1007 13:24:55.360032 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" event={"ID":"36002010-873a-43db-bc92-b37c7eb7bf35","Type":"ContainerStarted","Data":"69249e5ef20298ffd9e382deb1a49940d81626beca4765a5bf1d48553bbec44c"} Oct 07 13:24:56 crc kubenswrapper[4959]: I1007 13:24:56.372786 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" event={"ID":"36002010-873a-43db-bc92-b37c7eb7bf35","Type":"ContainerStarted","Data":"7bc0ba797e4bfb7465f2b0445fe04985722c6d7767d548ecb260f834bf6f8bc3"} Oct 07 13:24:56 crc kubenswrapper[4959]: I1007 13:24:56.405567 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" podStartSLOduration=1.908096408 podStartE2EDuration="2.405549186s" podCreationTimestamp="2025-10-07 13:24:54 +0000 UTC" firstStartedPulling="2025-10-07 13:24:55.243385765 +0000 UTC m=+1447.404108442" lastFinishedPulling="2025-10-07 13:24:55.740838543 +0000 UTC m=+1447.901561220" observedRunningTime="2025-10-07 13:24:56.390718131 +0000 UTC m=+1448.551440898" watchObservedRunningTime="2025-10-07 13:24:56.405549186 +0000 UTC m=+1448.566271863" Oct 07 13:25:01 crc kubenswrapper[4959]: I1007 13:25:01.268311 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wz6xj" Oct 07 13:25:01 crc kubenswrapper[4959]: I1007 13:25:01.313556 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-wz6xj"] Oct 07 13:25:01 crc kubenswrapper[4959]: I1007 13:25:01.451243 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wz6xj" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="registry-server" containerID="cri-o://d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b" gracePeriod=2 Oct 07 13:25:01 crc kubenswrapper[4959]: I1007 13:25:01.932660 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wz6xj" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.056675 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-utilities\") pod \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.056848 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rgg9\" (UniqueName: \"kubernetes.io/projected/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-kube-api-access-2rgg9\") pod \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.056888 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-catalog-content\") pod \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\" (UID: \"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a\") " Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.060723 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-utilities" (OuterVolumeSpecName: "utilities") pod "b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" (UID: 
"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.065504 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-kube-api-access-2rgg9" (OuterVolumeSpecName: "kube-api-access-2rgg9") pod "b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" (UID: "b61c1e97-9fcc-4116-aeb6-70ee63d38c3a"). InnerVolumeSpecName "kube-api-access-2rgg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.117574 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" (UID: "b61c1e97-9fcc-4116-aeb6-70ee63d38c3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.158644 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.158677 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rgg9\" (UniqueName: \"kubernetes.io/projected/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-kube-api-access-2rgg9\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.158687 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.467511 4959 generic.go:334] "Generic (PLEG): container finished" 
podID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerID="d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b" exitCode=0 Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.467558 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerDied","Data":"d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b"} Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.467587 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz6xj" event={"ID":"b61c1e97-9fcc-4116-aeb6-70ee63d38c3a","Type":"ContainerDied","Data":"5ffea2ad6b16bf8cf358482a232a4fc534a5b766a8a53c6b861803bf7e81495f"} Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.467635 4959 scope.go:117] "RemoveContainer" containerID="d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.469399 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wz6xj" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.507011 4959 scope.go:117] "RemoveContainer" containerID="8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.526813 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wz6xj"] Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.534100 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wz6xj"] Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.547152 4959 scope.go:117] "RemoveContainer" containerID="1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.570683 4959 scope.go:117] "RemoveContainer" containerID="d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b" Oct 07 13:25:02 crc kubenswrapper[4959]: E1007 13:25:02.571198 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b\": container with ID starting with d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b not found: ID does not exist" containerID="d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.571241 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b"} err="failed to get container status \"d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b\": rpc error: code = NotFound desc = could not find container \"d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b\": container with ID starting with d748f2a301a0f1f9ff869f9e65b2a93797b49c2842bcac7a517260e6b07c625b not 
found: ID does not exist" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.571270 4959 scope.go:117] "RemoveContainer" containerID="8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb" Oct 07 13:25:02 crc kubenswrapper[4959]: E1007 13:25:02.571541 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb\": container with ID starting with 8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb not found: ID does not exist" containerID="8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.571569 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb"} err="failed to get container status \"8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb\": rpc error: code = NotFound desc = could not find container \"8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb\": container with ID starting with 8828e5bde03eabd2f375714ef612f9ee7bbcd43071a52203149a932fbd83dcfb not found: ID does not exist" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.571589 4959 scope.go:117] "RemoveContainer" containerID="1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d" Oct 07 13:25:02 crc kubenswrapper[4959]: E1007 13:25:02.571877 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d\": container with ID starting with 1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d not found: ID does not exist" containerID="1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.571914 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d"} err="failed to get container status \"1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d\": rpc error: code = NotFound desc = could not find container \"1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d\": container with ID starting with 1e5f5672f0f76137bcfe3cdc973b00bc409440dc7c6a0ff68dd6e256fb058c8d not found: ID does not exist" Oct 07 13:25:02 crc kubenswrapper[4959]: I1007 13:25:02.829484 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" path="/var/lib/kubelet/pods/b61c1e97-9fcc-4116-aeb6-70ee63d38c3a/volumes" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.253252 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-psln2"] Oct 07 13:25:19 crc kubenswrapper[4959]: E1007 13:25:19.254323 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="extract-utilities" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.254342 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="extract-utilities" Oct 07 13:25:19 crc kubenswrapper[4959]: E1007 13:25:19.254354 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="registry-server" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.254362 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="registry-server" Oct 07 13:25:19 crc kubenswrapper[4959]: E1007 13:25:19.254378 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="extract-content" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 
13:25:19.254385 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="extract-content" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.254574 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61c1e97-9fcc-4116-aeb6-70ee63d38c3a" containerName="registry-server" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.255834 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.261370 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-psln2"] Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.373421 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-catalog-content\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.373485 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-utilities\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.373677 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hvvn\" (UniqueName: \"kubernetes.io/projected/81228426-4dad-48f4-bc23-f3f2105346de-kube-api-access-4hvvn\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: 
I1007 13:25:19.475331 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-catalog-content\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.475398 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-utilities\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.475441 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hvvn\" (UniqueName: \"kubernetes.io/projected/81228426-4dad-48f4-bc23-f3f2105346de-kube-api-access-4hvvn\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.475815 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-catalog-content\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.475949 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-utilities\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.496464 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hvvn\" (UniqueName: \"kubernetes.io/projected/81228426-4dad-48f4-bc23-f3f2105346de-kube-api-access-4hvvn\") pod \"redhat-marketplace-psln2\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") " pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:19 crc kubenswrapper[4959]: I1007 13:25:19.576461 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psln2" Oct 07 13:25:20 crc kubenswrapper[4959]: I1007 13:25:20.029568 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-psln2"] Oct 07 13:25:20 crc kubenswrapper[4959]: I1007 13:25:20.639830 4959 generic.go:334] "Generic (PLEG): container finished" podID="81228426-4dad-48f4-bc23-f3f2105346de" containerID="430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31" exitCode=0 Oct 07 13:25:20 crc kubenswrapper[4959]: I1007 13:25:20.639951 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psln2" event={"ID":"81228426-4dad-48f4-bc23-f3f2105346de","Type":"ContainerDied","Data":"430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31"} Oct 07 13:25:20 crc kubenswrapper[4959]: I1007 13:25:20.640263 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psln2" event={"ID":"81228426-4dad-48f4-bc23-f3f2105346de","Type":"ContainerStarted","Data":"6d753272e9b9b15d2750836658e2e261c187c061291b4edba59e3704840baec2"} Oct 07 13:25:21 crc kubenswrapper[4959]: I1007 13:25:21.652740 4959 generic.go:334] "Generic (PLEG): container finished" podID="81228426-4dad-48f4-bc23-f3f2105346de" containerID="6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f" exitCode=0 Oct 07 13:25:21 crc kubenswrapper[4959]: I1007 13:25:21.652827 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-psln2" event={"ID":"81228426-4dad-48f4-bc23-f3f2105346de","Type":"ContainerDied","Data":"6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f"}
Oct 07 13:25:22 crc kubenswrapper[4959]: I1007 13:25:22.663572 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psln2" event={"ID":"81228426-4dad-48f4-bc23-f3f2105346de","Type":"ContainerStarted","Data":"9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544"}
Oct 07 13:25:22 crc kubenswrapper[4959]: I1007 13:25:22.685061 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-psln2" podStartSLOduration=2.149187593 podStartE2EDuration="3.685043111s" podCreationTimestamp="2025-10-07 13:25:19 +0000 UTC" firstStartedPulling="2025-10-07 13:25:20.644878833 +0000 UTC m=+1472.805601510" lastFinishedPulling="2025-10-07 13:25:22.180734351 +0000 UTC m=+1474.341457028" observedRunningTime="2025-10-07 13:25:22.678116727 +0000 UTC m=+1474.838839394" watchObservedRunningTime="2025-10-07 13:25:22.685043111 +0000 UTC m=+1474.845765788"
Oct 07 13:25:29 crc kubenswrapper[4959]: I1007 13:25:29.576555 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-psln2"
Oct 07 13:25:29 crc kubenswrapper[4959]: I1007 13:25:29.577164 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-psln2"
Oct 07 13:25:29 crc kubenswrapper[4959]: I1007 13:25:29.625266 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-psln2"
Oct 07 13:25:29 crc kubenswrapper[4959]: I1007 13:25:29.769496 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-psln2"
Oct 07 13:25:29 crc kubenswrapper[4959]: I1007 13:25:29.860814 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-psln2"]
Oct 07 13:25:31 crc kubenswrapper[4959]: I1007 13:25:31.758086 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-psln2" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="registry-server" containerID="cri-o://9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544" gracePeriod=2
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.237970 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psln2"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.404388 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hvvn\" (UniqueName: \"kubernetes.io/projected/81228426-4dad-48f4-bc23-f3f2105346de-kube-api-access-4hvvn\") pod \"81228426-4dad-48f4-bc23-f3f2105346de\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") "
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.404572 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-catalog-content\") pod \"81228426-4dad-48f4-bc23-f3f2105346de\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") "
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.405029 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-utilities\") pod \"81228426-4dad-48f4-bc23-f3f2105346de\" (UID: \"81228426-4dad-48f4-bc23-f3f2105346de\") "
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.405761 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-utilities" (OuterVolumeSpecName: "utilities") pod "81228426-4dad-48f4-bc23-f3f2105346de" (UID: "81228426-4dad-48f4-bc23-f3f2105346de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.413214 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81228426-4dad-48f4-bc23-f3f2105346de-kube-api-access-4hvvn" (OuterVolumeSpecName: "kube-api-access-4hvvn") pod "81228426-4dad-48f4-bc23-f3f2105346de" (UID: "81228426-4dad-48f4-bc23-f3f2105346de"). InnerVolumeSpecName "kube-api-access-4hvvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.425016 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81228426-4dad-48f4-bc23-f3f2105346de" (UID: "81228426-4dad-48f4-bc23-f3f2105346de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.508374 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.508448 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81228426-4dad-48f4-bc23-f3f2105346de-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.508467 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hvvn\" (UniqueName: \"kubernetes.io/projected/81228426-4dad-48f4-bc23-f3f2105346de-kube-api-access-4hvvn\") on node \"crc\" DevicePath \"\""
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.773032 4959 generic.go:334] "Generic (PLEG): container finished" podID="81228426-4dad-48f4-bc23-f3f2105346de" containerID="9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544" exitCode=0
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.773234 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psln2"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.773231 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psln2" event={"ID":"81228426-4dad-48f4-bc23-f3f2105346de","Type":"ContainerDied","Data":"9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544"}
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.774869 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psln2" event={"ID":"81228426-4dad-48f4-bc23-f3f2105346de","Type":"ContainerDied","Data":"6d753272e9b9b15d2750836658e2e261c187c061291b4edba59e3704840baec2"}
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.774891 4959 scope.go:117] "RemoveContainer" containerID="9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.807027 4959 scope.go:117] "RemoveContainer" containerID="6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.836809 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-psln2"]
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.840725 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-psln2"]
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.848883 4959 scope.go:117] "RemoveContainer" containerID="430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.884837 4959 scope.go:117] "RemoveContainer" containerID="9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544"
Oct 07 13:25:32 crc kubenswrapper[4959]: E1007 13:25:32.885301 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544\": container with ID starting with 9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544 not found: ID does not exist" containerID="9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.885346 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544"} err="failed to get container status \"9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544\": rpc error: code = NotFound desc = could not find container \"9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544\": container with ID starting with 9541a3512128e79e706a110d3d4ab3b07d450963fb24b773abbb0dfb8d359544 not found: ID does not exist"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.885377 4959 scope.go:117] "RemoveContainer" containerID="6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f"
Oct 07 13:25:32 crc kubenswrapper[4959]: E1007 13:25:32.886093 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f\": container with ID starting with 6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f not found: ID does not exist" containerID="6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.886141 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f"} err="failed to get container status \"6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f\": rpc error: code = NotFound desc = could not find container \"6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f\": container with ID starting with 6e35139f34e95466f8bbe25beb130704fcee39d15ebf919bdc7d2a31502de62f not found: ID does not exist"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.886170 4959 scope.go:117] "RemoveContainer" containerID="430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31"
Oct 07 13:25:32 crc kubenswrapper[4959]: E1007 13:25:32.886759 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31\": container with ID starting with 430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31 not found: ID does not exist" containerID="430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31"
Oct 07 13:25:32 crc kubenswrapper[4959]: I1007 13:25:32.886808 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31"} err="failed to get container status \"430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31\": rpc error: code = NotFound desc = could not find container \"430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31\": container with ID starting with 430731cf022f96ce4b1c4348182509b8dadc0dd5c1fcecc09a79d07c98c93c31 not found: ID does not exist"
Oct 07 13:25:34 crc kubenswrapper[4959]: I1007 13:25:34.826588 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81228426-4dad-48f4-bc23-f3f2105346de" path="/var/lib/kubelet/pods/81228426-4dad-48f4-bc23-f3f2105346de/volumes"
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.115609 4959 generic.go:334] "Generic (PLEG): container finished" podID="36002010-873a-43db-bc92-b37c7eb7bf35" containerID="7bc0ba797e4bfb7465f2b0445fe04985722c6d7767d548ecb260f834bf6f8bc3" exitCode=0
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.115667 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" event={"ID":"36002010-873a-43db-bc92-b37c7eb7bf35","Type":"ContainerDied","Data":"7bc0ba797e4bfb7465f2b0445fe04985722c6d7767d548ecb260f834bf6f8bc3"}
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.960074 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p27d2"]
Oct 07 13:26:04 crc kubenswrapper[4959]: E1007 13:26:04.961288 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="extract-utilities"
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.961308 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="extract-utilities"
Oct 07 13:26:04 crc kubenswrapper[4959]: E1007 13:26:04.961339 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="registry-server"
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.961348 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="registry-server"
Oct 07 13:26:04 crc kubenswrapper[4959]: E1007 13:26:04.961362 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="extract-content"
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.961370 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="extract-content"
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.961565 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="81228426-4dad-48f4-bc23-f3f2105346de" containerName="registry-server"
Oct 07 13:26:04 crc kubenswrapper[4959]: I1007 13:26:04.962857 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.002209 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p27d2"]
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.118082 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-catalog-content\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.118164 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-utilities\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.118249 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2sq9\" (UniqueName: \"kubernetes.io/projected/4bc2a6fd-da94-4780-b44c-574f2b45d3af-kube-api-access-j2sq9\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.219529 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-catalog-content\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.219603 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-utilities\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.219704 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2sq9\" (UniqueName: \"kubernetes.io/projected/4bc2a6fd-da94-4780-b44c-574f2b45d3af-kube-api-access-j2sq9\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.220145 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-catalog-content\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.220171 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-utilities\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.263652 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2sq9\" (UniqueName: \"kubernetes.io/projected/4bc2a6fd-da94-4780-b44c-574f2b45d3af-kube-api-access-j2sq9\") pod \"community-operators-p27d2\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.297293 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.669198 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.833042 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-ssh-key\") pod \"36002010-873a-43db-bc92-b37c7eb7bf35\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") "
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.833102 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqq5p\" (UniqueName: \"kubernetes.io/projected/36002010-873a-43db-bc92-b37c7eb7bf35-kube-api-access-mqq5p\") pod \"36002010-873a-43db-bc92-b37c7eb7bf35\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") "
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.833235 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-inventory\") pod \"36002010-873a-43db-bc92-b37c7eb7bf35\" (UID: \"36002010-873a-43db-bc92-b37c7eb7bf35\") "
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.838534 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36002010-873a-43db-bc92-b37c7eb7bf35-kube-api-access-mqq5p" (OuterVolumeSpecName: "kube-api-access-mqq5p") pod "36002010-873a-43db-bc92-b37c7eb7bf35" (UID: "36002010-873a-43db-bc92-b37c7eb7bf35"). InnerVolumeSpecName "kube-api-access-mqq5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.858922 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-inventory" (OuterVolumeSpecName: "inventory") pod "36002010-873a-43db-bc92-b37c7eb7bf35" (UID: "36002010-873a-43db-bc92-b37c7eb7bf35"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.859244 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36002010-873a-43db-bc92-b37c7eb7bf35" (UID: "36002010-873a-43db-bc92-b37c7eb7bf35"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.898652 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p27d2"]
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.935144 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.935178 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqq5p\" (UniqueName: \"kubernetes.io/projected/36002010-873a-43db-bc92-b37c7eb7bf35-kube-api-access-mqq5p\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:05 crc kubenswrapper[4959]: I1007 13:26:05.935191 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36002010-873a-43db-bc92-b37c7eb7bf35-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.134569 4959 generic.go:334] "Generic (PLEG): container finished" podID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerID="68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528" exitCode=0
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.134660 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p27d2" event={"ID":"4bc2a6fd-da94-4780-b44c-574f2b45d3af","Type":"ContainerDied","Data":"68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528"}
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.134729 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p27d2" event={"ID":"4bc2a6fd-da94-4780-b44c-574f2b45d3af","Type":"ContainerStarted","Data":"15fb0d480f76105bd0d58aa11af6278593153587f6f1b150b011f561ba4b0061"}
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.137996 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8" event={"ID":"36002010-873a-43db-bc92-b37c7eb7bf35","Type":"ContainerDied","Data":"69249e5ef20298ffd9e382deb1a49940d81626beca4765a5bf1d48553bbec44c"}
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.138037 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69249e5ef20298ffd9e382deb1a49940d81626beca4765a5bf1d48553bbec44c"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.138090 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.217224 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"]
Oct 07 13:26:06 crc kubenswrapper[4959]: E1007 13:26:06.217837 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36002010-873a-43db-bc92-b37c7eb7bf35" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.217944 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="36002010-873a-43db-bc92-b37c7eb7bf35" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.218183 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="36002010-873a-43db-bc92-b37c7eb7bf35" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.219230 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.223542 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.223679 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.223876 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.226118 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.234071 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"]
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.340420 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.340496 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cs4k\" (UniqueName: \"kubernetes.io/projected/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-kube-api-access-4cs4k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.340545 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.478781 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cs4k\" (UniqueName: \"kubernetes.io/projected/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-kube-api-access-4cs4k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.478856 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.478981 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.484642 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.484801 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.494875 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cs4k\" (UniqueName: \"kubernetes.io/projected/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-kube-api-access-4cs4k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:06 crc kubenswrapper[4959]: I1007 13:26:06.536913 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:07 crc kubenswrapper[4959]: I1007 13:26:07.064991 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"]
Oct 07 13:26:07 crc kubenswrapper[4959]: W1007 13:26:07.068819 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b6c15c4_cb23_4875_83e9_2e5f2669d56a.slice/crio-82c1388cdad4338f1ca59e562d686708e0a14575633ddf2739f3c7e8e7184865 WatchSource:0}: Error finding container 82c1388cdad4338f1ca59e562d686708e0a14575633ddf2739f3c7e8e7184865: Status 404 returned error can't find the container with id 82c1388cdad4338f1ca59e562d686708e0a14575633ddf2739f3c7e8e7184865
Oct 07 13:26:07 crc kubenswrapper[4959]: I1007 13:26:07.145650 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx" event={"ID":"9b6c15c4-cb23-4875-83e9-2e5f2669d56a","Type":"ContainerStarted","Data":"82c1388cdad4338f1ca59e562d686708e0a14575633ddf2739f3c7e8e7184865"}
Oct 07 13:26:08 crc kubenswrapper[4959]: I1007 13:26:08.159910 4959 generic.go:334] "Generic (PLEG): container finished" podID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerID="2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59" exitCode=0
Oct 07 13:26:08 crc kubenswrapper[4959]: I1007 13:26:08.160022 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p27d2" event={"ID":"4bc2a6fd-da94-4780-b44c-574f2b45d3af","Type":"ContainerDied","Data":"2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59"}
Oct 07 13:26:08 crc kubenswrapper[4959]: I1007 13:26:08.177817 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx" event={"ID":"9b6c15c4-cb23-4875-83e9-2e5f2669d56a","Type":"ContainerStarted","Data":"eb58a85a48861919ebc71e41aedeff86cd9a3f1465e2684e669ecf0d3d662a0b"}
Oct 07 13:26:08 crc kubenswrapper[4959]: I1007 13:26:08.249579 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx" podStartSLOduration=1.568383022 podStartE2EDuration="2.249548045s" podCreationTimestamp="2025-10-07 13:26:06 +0000 UTC" firstStartedPulling="2025-10-07 13:26:07.071562582 +0000 UTC m=+1519.232285259" lastFinishedPulling="2025-10-07 13:26:07.752727605 +0000 UTC m=+1519.913450282" observedRunningTime="2025-10-07 13:26:08.224230097 +0000 UTC m=+1520.384952804" watchObservedRunningTime="2025-10-07 13:26:08.249548045 +0000 UTC m=+1520.410270752"
Oct 07 13:26:09 crc kubenswrapper[4959]: I1007 13:26:09.192067 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p27d2" event={"ID":"4bc2a6fd-da94-4780-b44c-574f2b45d3af","Type":"ContainerStarted","Data":"17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb"}
Oct 07 13:26:09 crc kubenswrapper[4959]: I1007 13:26:09.225119 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p27d2" podStartSLOduration=2.635210472 podStartE2EDuration="5.225091918s" podCreationTimestamp="2025-10-07 13:26:04 +0000 UTC" firstStartedPulling="2025-10-07 13:26:06.136890251 +0000 UTC m=+1518.297612928" lastFinishedPulling="2025-10-07 13:26:08.726771687 +0000 UTC m=+1520.887494374" observedRunningTime="2025-10-07 13:26:09.2162094 +0000 UTC m=+1521.376932137" watchObservedRunningTime="2025-10-07 13:26:09.225091918 +0000 UTC m=+1521.385814635"
Oct 07 13:26:10 crc kubenswrapper[4959]: I1007 13:26:10.052515 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-q9ncv"]
Oct 07 13:26:10 crc kubenswrapper[4959]: I1007 13:26:10.060773 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-q9ncv"]
Oct 07 13:26:10 crc kubenswrapper[4959]: I1007 13:26:10.823996 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6679f23b-b441-424d-9203-34c965f1e655" path="/var/lib/kubelet/pods/6679f23b-b441-424d-9203-34c965f1e655/volumes"
Oct 07 13:26:13 crc kubenswrapper[4959]: I1007 13:26:13.237090 4959 generic.go:334] "Generic (PLEG): container finished" podID="9b6c15c4-cb23-4875-83e9-2e5f2669d56a" containerID="eb58a85a48861919ebc71e41aedeff86cd9a3f1465e2684e669ecf0d3d662a0b" exitCode=0
Oct 07 13:26:13 crc kubenswrapper[4959]: I1007 13:26:13.237174 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx" event={"ID":"9b6c15c4-cb23-4875-83e9-2e5f2669d56a","Type":"ContainerDied","Data":"eb58a85a48861919ebc71e41aedeff86cd9a3f1465e2684e669ecf0d3d662a0b"}
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.040813 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-c2s2g"]
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.056929 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8zjb9"]
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.069292 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-c2s2g"]
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.077349 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8zjb9"]
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.638034 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.753812 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-ssh-key\") pod \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") "
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.753931 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cs4k\" (UniqueName: \"kubernetes.io/projected/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-kube-api-access-4cs4k\") pod \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") "
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.753979 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-inventory\") pod \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\" (UID: \"9b6c15c4-cb23-4875-83e9-2e5f2669d56a\") "
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.758950 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-kube-api-access-4cs4k" (OuterVolumeSpecName: "kube-api-access-4cs4k") pod "9b6c15c4-cb23-4875-83e9-2e5f2669d56a" (UID: "9b6c15c4-cb23-4875-83e9-2e5f2669d56a"). InnerVolumeSpecName "kube-api-access-4cs4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.780649 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b6c15c4-cb23-4875-83e9-2e5f2669d56a" (UID: "9b6c15c4-cb23-4875-83e9-2e5f2669d56a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.781117 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-inventory" (OuterVolumeSpecName: "inventory") pod "9b6c15c4-cb23-4875-83e9-2e5f2669d56a" (UID: "9b6c15c4-cb23-4875-83e9-2e5f2669d56a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.821153 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae65892-33d2-4485-9586-318c8cc1b97f" path="/var/lib/kubelet/pods/2ae65892-33d2-4485-9586-318c8cc1b97f/volumes"
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.822514 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00ade55-5009-425f-9c72-3f3b39cf32c5" path="/var/lib/kubelet/pods/a00ade55-5009-425f-9c72-3f3b39cf32c5/volumes"
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.855987 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.856030 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:14 crc kubenswrapper[4959]: I1007 13:26:14.856044 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cs4k\" (UniqueName: \"kubernetes.io/projected/9b6c15c4-cb23-4875-83e9-2e5f2669d56a-kube-api-access-4cs4k\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.256354 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx" event={"ID":"9b6c15c4-cb23-4875-83e9-2e5f2669d56a","Type":"ContainerDied","Data":"82c1388cdad4338f1ca59e562d686708e0a14575633ddf2739f3c7e8e7184865"}
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.256388 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82c1388cdad4338f1ca59e562d686708e0a14575633ddf2739f3c7e8e7184865"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.256464 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.298162 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.298495 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p27d2"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.341961 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"]
Oct 07 13:26:15 crc kubenswrapper[4959]: E1007 13:26:15.342308 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6c15c4-cb23-4875-83e9-2e5f2669d56a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.342326 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6c15c4-cb23-4875-83e9-2e5f2669d56a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.342528 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6c15c4-cb23-4875-83e9-2e5f2669d56a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.343313 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.346289 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.346887 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.346895 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.348398 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.356771 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"] Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.372896 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ww7s\" (UniqueName: \"kubernetes.io/projected/23e3d7e0-dbba-4eb6-ac01-885b020435ee-kube-api-access-2ww7s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.372947 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.372972 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.386902 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p27d2" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.475859 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.476160 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.476312 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ww7s\" (UniqueName: \"kubernetes.io/projected/23e3d7e0-dbba-4eb6-ac01-885b020435ee-kube-api-access-2ww7s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.482836 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.491662 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.507059 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ww7s\" (UniqueName: \"kubernetes.io/projected/23e3d7e0-dbba-4eb6-ac01-885b020435ee-kube-api-access-2ww7s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgtzf\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:15 crc kubenswrapper[4959]: I1007 13:26:15.663033 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" Oct 07 13:26:16 crc kubenswrapper[4959]: I1007 13:26:16.276972 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"] Oct 07 13:26:16 crc kubenswrapper[4959]: I1007 13:26:16.303125 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:26:16 crc kubenswrapper[4959]: I1007 13:26:16.335194 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p27d2" Oct 07 13:26:16 crc kubenswrapper[4959]: I1007 13:26:16.395072 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p27d2"] Oct 07 13:26:17 crc kubenswrapper[4959]: I1007 13:26:17.280751 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" event={"ID":"23e3d7e0-dbba-4eb6-ac01-885b020435ee","Type":"ContainerStarted","Data":"1fa5c56c06fa7f6ce291c46fafd5542c6ad8e1ffc3dbcde1553096c841416829"} Oct 07 13:26:17 crc kubenswrapper[4959]: I1007 13:26:17.281096 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" event={"ID":"23e3d7e0-dbba-4eb6-ac01-885b020435ee","Type":"ContainerStarted","Data":"16713ed3116c7dc4129d11867d25628dbafd8bce2ed8c8399ab275a8e0b41e8b"} Oct 07 13:26:17 crc kubenswrapper[4959]: I1007 13:26:17.304456 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" podStartSLOduration=1.764238019 podStartE2EDuration="2.30442446s" podCreationTimestamp="2025-10-07 13:26:15 +0000 UTC" firstStartedPulling="2025-10-07 13:26:16.302811659 +0000 UTC m=+1528.463534336" lastFinishedPulling="2025-10-07 13:26:16.84299807 +0000 UTC m=+1529.003720777" 
observedRunningTime="2025-10-07 13:26:17.302548628 +0000 UTC m=+1529.463271335" watchObservedRunningTime="2025-10-07 13:26:17.30442446 +0000 UTC m=+1529.465147177" Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.292189 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p27d2" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="registry-server" containerID="cri-o://17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb" gracePeriod=2 Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.771623 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p27d2" Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.947032 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2sq9\" (UniqueName: \"kubernetes.io/projected/4bc2a6fd-da94-4780-b44c-574f2b45d3af-kube-api-access-j2sq9\") pod \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.947191 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-utilities\") pod \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.947223 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-catalog-content\") pod \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\" (UID: \"4bc2a6fd-da94-4780-b44c-574f2b45d3af\") " Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.948120 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-utilities" (OuterVolumeSpecName: "utilities") pod "4bc2a6fd-da94-4780-b44c-574f2b45d3af" (UID: "4bc2a6fd-da94-4780-b44c-574f2b45d3af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.949733 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:18 crc kubenswrapper[4959]: I1007 13:26:18.954485 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc2a6fd-da94-4780-b44c-574f2b45d3af-kube-api-access-j2sq9" (OuterVolumeSpecName: "kube-api-access-j2sq9") pod "4bc2a6fd-da94-4780-b44c-574f2b45d3af" (UID: "4bc2a6fd-da94-4780-b44c-574f2b45d3af"). InnerVolumeSpecName "kube-api-access-j2sq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.051227 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2sq9\" (UniqueName: \"kubernetes.io/projected/4bc2a6fd-da94-4780-b44c-574f2b45d3af-kube-api-access-j2sq9\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.303848 4959 generic.go:334] "Generic (PLEG): container finished" podID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerID="17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb" exitCode=0 Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.303896 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p27d2" event={"ID":"4bc2a6fd-da94-4780-b44c-574f2b45d3af","Type":"ContainerDied","Data":"17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb"} Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.303928 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-p27d2" event={"ID":"4bc2a6fd-da94-4780-b44c-574f2b45d3af","Type":"ContainerDied","Data":"15fb0d480f76105bd0d58aa11af6278593153587f6f1b150b011f561ba4b0061"} Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.303953 4959 scope.go:117] "RemoveContainer" containerID="17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.304224 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p27d2" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.341845 4959 scope.go:117] "RemoveContainer" containerID="2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.383038 4959 scope.go:117] "RemoveContainer" containerID="68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.442942 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bc2a6fd-da94-4780-b44c-574f2b45d3af" (UID: "4bc2a6fd-da94-4780-b44c-574f2b45d3af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.446137 4959 scope.go:117] "RemoveContainer" containerID="17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb" Oct 07 13:26:19 crc kubenswrapper[4959]: E1007 13:26:19.446984 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb\": container with ID starting with 17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb not found: ID does not exist" containerID="17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.447096 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb"} err="failed to get container status \"17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb\": rpc error: code = NotFound desc = could not find container \"17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb\": container with ID starting with 17c14851ced41963470922eafff551f57b58b0d12227fe3f43d6f10fcc5da8cb not found: ID does not exist" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.447153 4959 scope.go:117] "RemoveContainer" containerID="2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59" Oct 07 13:26:19 crc kubenswrapper[4959]: E1007 13:26:19.447952 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59\": container with ID starting with 2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59 not found: ID does not exist" containerID="2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.448004 
4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59"} err="failed to get container status \"2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59\": rpc error: code = NotFound desc = could not find container \"2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59\": container with ID starting with 2acb2fb992eb4b54b0ff79350fbce81cfccbf469bf4bd95947c51ed882ec9d59 not found: ID does not exist" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.448037 4959 scope.go:117] "RemoveContainer" containerID="68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528" Oct 07 13:26:19 crc kubenswrapper[4959]: E1007 13:26:19.448664 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528\": container with ID starting with 68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528 not found: ID does not exist" containerID="68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.448739 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528"} err="failed to get container status \"68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528\": rpc error: code = NotFound desc = could not find container \"68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528\": container with ID starting with 68ba50d9db9d1eb70b6ba57e1eb26444455e080f719971f4f8f32674b7583528 not found: ID does not exist" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.466586 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4bc2a6fd-da94-4780-b44c-574f2b45d3af-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.659344 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p27d2"] Oct 07 13:26:19 crc kubenswrapper[4959]: I1007 13:26:19.670276 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p27d2"] Oct 07 13:26:20 crc kubenswrapper[4959]: I1007 13:26:20.035452 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3992-account-create-p5rn5"] Oct 07 13:26:20 crc kubenswrapper[4959]: I1007 13:26:20.056859 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3992-account-create-p5rn5"] Oct 07 13:26:20 crc kubenswrapper[4959]: I1007 13:26:20.832730 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10da62a9-350c-4749-b6d7-481dc5557926" path="/var/lib/kubelet/pods/10da62a9-350c-4749-b6d7-481dc5557926/volumes" Oct 07 13:26:20 crc kubenswrapper[4959]: I1007 13:26:20.833895 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" path="/var/lib/kubelet/pods/4bc2a6fd-da94-4780-b44c-574f2b45d3af/volumes" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.802935 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g8bdr"] Oct 07 13:26:21 crc kubenswrapper[4959]: E1007 13:26:21.803284 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="extract-utilities" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.803296 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="extract-utilities" Oct 07 13:26:21 crc kubenswrapper[4959]: E1007 13:26:21.803311 4959 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="extract-content" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.803316 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="extract-content" Oct 07 13:26:21 crc kubenswrapper[4959]: E1007 13:26:21.803332 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="registry-server" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.803343 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="registry-server" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.803508 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc2a6fd-da94-4780-b44c-574f2b45d3af" containerName="registry-server" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.804845 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.830781 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g8bdr"] Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.911667 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-catalog-content\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.911749 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7dl\" (UniqueName: \"kubernetes.io/projected/5f0b5626-ddf3-406b-974f-71a5d07bf03a-kube-api-access-bp7dl\") pod \"redhat-operators-g8bdr\" (UID: 
\"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:21 crc kubenswrapper[4959]: I1007 13:26:21.911769 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-utilities\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.013611 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-catalog-content\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.013988 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7dl\" (UniqueName: \"kubernetes.io/projected/5f0b5626-ddf3-406b-974f-71a5d07bf03a-kube-api-access-bp7dl\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.014012 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-utilities\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.014485 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-catalog-content\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " 
pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.014498 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-utilities\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.041243 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7dl\" (UniqueName: \"kubernetes.io/projected/5f0b5626-ddf3-406b-974f-71a5d07bf03a-kube-api-access-bp7dl\") pod \"redhat-operators-g8bdr\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") " pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.149153 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8bdr" Oct 07 13:26:22 crc kubenswrapper[4959]: I1007 13:26:22.632500 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g8bdr"] Oct 07 13:26:23 crc kubenswrapper[4959]: I1007 13:26:23.348346 4959 generic.go:334] "Generic (PLEG): container finished" podID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerID="0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05" exitCode=0 Oct 07 13:26:23 crc kubenswrapper[4959]: I1007 13:26:23.348443 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerDied","Data":"0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05"} Oct 07 13:26:23 crc kubenswrapper[4959]: I1007 13:26:23.348728 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" 
event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerStarted","Data":"13ef6956954b9c19cb2d546fc5588ead033cf608cb96c8c7fb50d9f9e876af56"} Oct 07 13:26:24 crc kubenswrapper[4959]: I1007 13:26:24.362655 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerStarted","Data":"e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd"} Oct 07 13:26:25 crc kubenswrapper[4959]: I1007 13:26:25.035611 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8d28-account-create-rk7w2"] Oct 07 13:26:25 crc kubenswrapper[4959]: I1007 13:26:25.045557 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a896-account-create-6kkkh"] Oct 07 13:26:25 crc kubenswrapper[4959]: I1007 13:26:25.054947 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8d28-account-create-rk7w2"] Oct 07 13:26:25 crc kubenswrapper[4959]: I1007 13:26:25.063864 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a896-account-create-6kkkh"] Oct 07 13:26:25 crc kubenswrapper[4959]: I1007 13:26:25.384291 4959 generic.go:334] "Generic (PLEG): container finished" podID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerID="e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd" exitCode=0 Oct 07 13:26:25 crc kubenswrapper[4959]: I1007 13:26:25.384329 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerDied","Data":"e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd"} Oct 07 13:26:26 crc kubenswrapper[4959]: I1007 13:26:26.829621 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb5bed5-58f4-47f5-9594-4bb0a951afa9" path="/var/lib/kubelet/pods/0fb5bed5-58f4-47f5-9594-4bb0a951afa9/volumes" Oct 07 13:26:26 crc 
kubenswrapper[4959]: I1007 13:26:26.831395 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435f9254-a99d-4d4a-831e-481261eb91b1" path="/var/lib/kubelet/pods/435f9254-a99d-4d4a-831e-481261eb91b1/volumes"
Oct 07 13:26:27 crc kubenswrapper[4959]: I1007 13:26:27.411879 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerStarted","Data":"048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd"}
Oct 07 13:26:27 crc kubenswrapper[4959]: I1007 13:26:27.448562 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g8bdr" podStartSLOduration=3.989736708 podStartE2EDuration="6.448542349s" podCreationTimestamp="2025-10-07 13:26:21 +0000 UTC" firstStartedPulling="2025-10-07 13:26:23.35012504 +0000 UTC m=+1535.510847717" lastFinishedPulling="2025-10-07 13:26:25.808930651 +0000 UTC m=+1537.969653358" observedRunningTime="2025-10-07 13:26:27.436980665 +0000 UTC m=+1539.597703372" watchObservedRunningTime="2025-10-07 13:26:27.448542349 +0000 UTC m=+1539.609265036"
Oct 07 13:26:32 crc kubenswrapper[4959]: I1007 13:26:32.149925 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g8bdr"
Oct 07 13:26:32 crc kubenswrapper[4959]: I1007 13:26:32.150946 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g8bdr"
Oct 07 13:26:32 crc kubenswrapper[4959]: I1007 13:26:32.214725 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g8bdr"
Oct 07 13:26:32 crc kubenswrapper[4959]: I1007 13:26:32.545998 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g8bdr"
Oct 07 13:26:32 crc kubenswrapper[4959]: I1007 13:26:32.608882 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g8bdr"]
Oct 07 13:26:34 crc kubenswrapper[4959]: I1007 13:26:34.498949 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g8bdr" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="registry-server" containerID="cri-o://048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd" gracePeriod=2
Oct 07 13:26:34 crc kubenswrapper[4959]: I1007 13:26:34.974673 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8bdr"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.003445 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-catalog-content\") pod \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") "
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.003604 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp7dl\" (UniqueName: \"kubernetes.io/projected/5f0b5626-ddf3-406b-974f-71a5d07bf03a-kube-api-access-bp7dl\") pod \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") "
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.003848 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-utilities\") pod \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\" (UID: \"5f0b5626-ddf3-406b-974f-71a5d07bf03a\") "
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.005757 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-utilities" (OuterVolumeSpecName: "utilities") pod "5f0b5626-ddf3-406b-974f-71a5d07bf03a" (UID: "5f0b5626-ddf3-406b-974f-71a5d07bf03a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.012085 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0b5626-ddf3-406b-974f-71a5d07bf03a-kube-api-access-bp7dl" (OuterVolumeSpecName: "kube-api-access-bp7dl") pod "5f0b5626-ddf3-406b-974f-71a5d07bf03a" (UID: "5f0b5626-ddf3-406b-974f-71a5d07bf03a"). InnerVolumeSpecName "kube-api-access-bp7dl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.087922 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f0b5626-ddf3-406b-974f-71a5d07bf03a" (UID: "5f0b5626-ddf3-406b-974f-71a5d07bf03a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.106491 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.106526 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp7dl\" (UniqueName: \"kubernetes.io/projected/5f0b5626-ddf3-406b-974f-71a5d07bf03a-kube-api-access-bp7dl\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.106539 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0b5626-ddf3-406b-974f-71a5d07bf03a-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.512816 4959 generic.go:334] "Generic (PLEG): container finished" podID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerID="048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd" exitCode=0
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.512855 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerDied","Data":"048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd"}
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.512882 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8bdr" event={"ID":"5f0b5626-ddf3-406b-974f-71a5d07bf03a","Type":"ContainerDied","Data":"13ef6956954b9c19cb2d546fc5588ead033cf608cb96c8c7fb50d9f9e876af56"}
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.512901 4959 scope.go:117] "RemoveContainer" containerID="048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.512927 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8bdr"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.561561 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g8bdr"]
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.563611 4959 scope.go:117] "RemoveContainer" containerID="e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.570150 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g8bdr"]
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.585237 4959 scope.go:117] "RemoveContainer" containerID="0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.659238 4959 scope.go:117] "RemoveContainer" containerID="048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd"
Oct 07 13:26:35 crc kubenswrapper[4959]: E1007 13:26:35.659961 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd\": container with ID starting with 048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd not found: ID does not exist" containerID="048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.660031 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd"} err="failed to get container status \"048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd\": rpc error: code = NotFound desc = could not find container \"048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd\": container with ID starting with 048cf59e385c67f71f6b195bebbad5bcc1689e23eb53eb4513614f75dbacbafd not found: ID does not exist"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.660061 4959 scope.go:117] "RemoveContainer" containerID="e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd"
Oct 07 13:26:35 crc kubenswrapper[4959]: E1007 13:26:35.661162 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd\": container with ID starting with e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd not found: ID does not exist" containerID="e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.661272 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd"} err="failed to get container status \"e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd\": rpc error: code = NotFound desc = could not find container \"e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd\": container with ID starting with e1739a35d0f82772017a8d45931448e1879fa530ba5832cc96620a2bc1b39afd not found: ID does not exist"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.661362 4959 scope.go:117] "RemoveContainer" containerID="0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05"
Oct 07 13:26:35 crc kubenswrapper[4959]: E1007 13:26:35.661888 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05\": container with ID starting with 0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05 not found: ID does not exist" containerID="0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05"
Oct 07 13:26:35 crc kubenswrapper[4959]: I1007 13:26:35.661956 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05"} err="failed to get container status \"0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05\": rpc error: code = NotFound desc = could not find container \"0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05\": container with ID starting with 0c4e354ee5ba4b54b4be53225d22320bd96ad8950a3104203a46f0f9c5952c05 not found: ID does not exist"
Oct 07 13:26:36 crc kubenswrapper[4959]: I1007 13:26:36.823949 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" path="/var/lib/kubelet/pods/5f0b5626-ddf3-406b-974f-71a5d07bf03a/volumes"
Oct 07 13:26:37 crc kubenswrapper[4959]: I1007 13:26:37.695666 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:26:37 crc kubenswrapper[4959]: I1007 13:26:37.695744 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:26:43 crc kubenswrapper[4959]: I1007 13:26:43.066398 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mr66q"]
Oct 07 13:26:43 crc kubenswrapper[4959]: I1007 13:26:43.078828 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nwskk"]
Oct 07 13:26:43 crc kubenswrapper[4959]: I1007 13:26:43.088791 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-cxjwg"]
Oct 07 13:26:43 crc kubenswrapper[4959]: I1007 13:26:43.098697 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nwskk"]
Oct 07 13:26:43 crc kubenswrapper[4959]: I1007 13:26:43.107268 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mr66q"]
Oct 07 13:26:43 crc kubenswrapper[4959]: I1007 13:26:43.114089 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-cxjwg"]
Oct 07 13:26:44 crc kubenswrapper[4959]: I1007 13:26:44.825029 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c95289-a526-40de-93ff-f7232cb3bf90" path="/var/lib/kubelet/pods/18c95289-a526-40de-93ff-f7232cb3bf90/volumes"
Oct 07 13:26:44 crc kubenswrapper[4959]: I1007 13:26:44.826680 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2cf091-11f1-43ed-9f57-bb01bc99da1f" path="/var/lib/kubelet/pods/3c2cf091-11f1-43ed-9f57-bb01bc99da1f/volumes"
Oct 07 13:26:44 crc kubenswrapper[4959]: I1007 13:26:44.827919 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfba247-ed13-4e8a-a125-f5b94bef38f6" path="/var/lib/kubelet/pods/fdfba247-ed13-4e8a-a125-f5b94bef38f6/volumes"
Oct 07 13:26:46 crc kubenswrapper[4959]: I1007 13:26:46.048844 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-45zn7"]
Oct 07 13:26:46 crc kubenswrapper[4959]: I1007 13:26:46.055914 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-45zn7"]
Oct 07 13:26:46 crc kubenswrapper[4959]: I1007 13:26:46.825336 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4" path="/var/lib/kubelet/pods/6ce87e98-33a9-4f7e-bd9c-5981f2f0f7d4/volumes"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.032618 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8phz8"]
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.045479 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8phz8"]
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.602693 4959 scope.go:117] "RemoveContainer" containerID="e63e589e33b68dbf95af6e5ba72c8d0fd46b1557434da8fd8f1c44f65d251292"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.624970 4959 scope.go:117] "RemoveContainer" containerID="1f2edbdc0900de2e167bdccde230acd02b205f2c5404e416842254a8334783cd"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.675102 4959 scope.go:117] "RemoveContainer" containerID="0d9b2c9488913ed0ad20bd2ca888538296debc9f01a93aff8617f44dedccc919"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.720845 4959 scope.go:117] "RemoveContainer" containerID="b82918d4f777b85c6f1f5ec435fc596dd1e6e4f5a09c2dbd2e07e0f6494e4247"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.760023 4959 scope.go:117] "RemoveContainer" containerID="4994e23adad59193118f7e57755c69a99e64b4489977d830f6eae4a5d5d3111b"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.801950 4959 scope.go:117] "RemoveContainer" containerID="ae1015e6de0fafa33d796250f5a1054d57387ce329d67215c3e8b718f5d0aebd"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.842453 4959 scope.go:117] "RemoveContainer" containerID="baf04b947eb480c3db9ea8e9c55c153df1a394062e3591eedefc31a150af12ec"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.868260 4959 scope.go:117] "RemoveContainer" containerID="7dc63e1f3e52e733bb6a6edbb21c7ac9a11b7ab55869c4a2be689879995fc866"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.888598 4959 scope.go:117] "RemoveContainer" containerID="1edefd15b57f4239069b4c50373d3b1bb37ae8f3393f1d88660a103bcea27a79"
Oct 07 13:26:51 crc kubenswrapper[4959]: I1007 13:26:51.924732 4959 scope.go:117] "RemoveContainer" containerID="b5e9272db5738235bad8f59f8b86398988648469419581dce9e25fede80f2e4b"
Oct 07 13:26:52 crc kubenswrapper[4959]: I1007 13:26:52.824124 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9b58f4-6d48-4be8-b788-386f6c267440" path="/var/lib/kubelet/pods/fd9b58f4-6d48-4be8-b788-386f6c267440/volumes"
Oct 07 13:26:53 crc kubenswrapper[4959]: I1007 13:26:53.028373 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-44fa-account-create-wvt5v"]
Oct 07 13:26:53 crc kubenswrapper[4959]: I1007 13:26:53.038418 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4df3-account-create-g4qsz"]
Oct 07 13:26:53 crc kubenswrapper[4959]: I1007 13:26:53.047643 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-44fa-account-create-wvt5v"]
Oct 07 13:26:53 crc kubenswrapper[4959]: I1007 13:26:53.054234 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4df3-account-create-g4qsz"]
Oct 07 13:26:53 crc kubenswrapper[4959]: I1007 13:26:53.697245 4959 generic.go:334] "Generic (PLEG): container finished" podID="23e3d7e0-dbba-4eb6-ac01-885b020435ee" containerID="1fa5c56c06fa7f6ce291c46fafd5542c6ad8e1ffc3dbcde1553096c841416829" exitCode=0
Oct 07 13:26:53 crc kubenswrapper[4959]: I1007 13:26:53.697282 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" event={"ID":"23e3d7e0-dbba-4eb6-ac01-885b020435ee","Type":"ContainerDied","Data":"1fa5c56c06fa7f6ce291c46fafd5542c6ad8e1ffc3dbcde1553096c841416829"}
Oct 07 13:26:54 crc kubenswrapper[4959]: I1007 13:26:54.029753 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c298-account-create-qkt7r"]
Oct 07 13:26:54 crc kubenswrapper[4959]: I1007 13:26:54.038197 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c298-account-create-qkt7r"]
Oct 07 13:26:54 crc kubenswrapper[4959]: I1007 13:26:54.821510 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38640b66-4901-479c-ade1-65fe23e63db6" path="/var/lib/kubelet/pods/38640b66-4901-479c-ade1-65fe23e63db6/volumes"
Oct 07 13:26:54 crc kubenswrapper[4959]: I1007 13:26:54.823758 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a6fe32-b9be-4678-acc7-9966256aa15d" path="/var/lib/kubelet/pods/90a6fe32-b9be-4678-acc7-9966256aa15d/volumes"
Oct 07 13:26:54 crc kubenswrapper[4959]: I1007 13:26:54.824406 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d01545ad-9354-4300-b539-c48ec9ff1862" path="/var/lib/kubelet/pods/d01545ad-9354-4300-b539-c48ec9ff1862/volumes"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.091328 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.286079 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ww7s\" (UniqueName: \"kubernetes.io/projected/23e3d7e0-dbba-4eb6-ac01-885b020435ee-kube-api-access-2ww7s\") pod \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") "
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.286162 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-inventory\") pod \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") "
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.286193 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-ssh-key\") pod \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\" (UID: \"23e3d7e0-dbba-4eb6-ac01-885b020435ee\") "
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.297379 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e3d7e0-dbba-4eb6-ac01-885b020435ee-kube-api-access-2ww7s" (OuterVolumeSpecName: "kube-api-access-2ww7s") pod "23e3d7e0-dbba-4eb6-ac01-885b020435ee" (UID: "23e3d7e0-dbba-4eb6-ac01-885b020435ee"). InnerVolumeSpecName "kube-api-access-2ww7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.311402 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23e3d7e0-dbba-4eb6-ac01-885b020435ee" (UID: "23e3d7e0-dbba-4eb6-ac01-885b020435ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.312126 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-inventory" (OuterVolumeSpecName: "inventory") pod "23e3d7e0-dbba-4eb6-ac01-885b020435ee" (UID: "23e3d7e0-dbba-4eb6-ac01-885b020435ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.387765 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ww7s\" (UniqueName: \"kubernetes.io/projected/23e3d7e0-dbba-4eb6-ac01-885b020435ee-kube-api-access-2ww7s\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.387793 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.387803 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23e3d7e0-dbba-4eb6-ac01-885b020435ee-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.717291 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.717775 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf" event={"ID":"23e3d7e0-dbba-4eb6-ac01-885b020435ee","Type":"ContainerDied","Data":"16713ed3116c7dc4129d11867d25628dbafd8bce2ed8c8399ab275a8e0b41e8b"}
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.717842 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16713ed3116c7dc4129d11867d25628dbafd8bce2ed8c8399ab275a8e0b41e8b"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789098 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"]
Oct 07 13:26:55 crc kubenswrapper[4959]: E1007 13:26:55.789494 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="extract-content"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789516 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="extract-content"
Oct 07 13:26:55 crc kubenswrapper[4959]: E1007 13:26:55.789537 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="registry-server"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789544 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="registry-server"
Oct 07 13:26:55 crc kubenswrapper[4959]: E1007 13:26:55.789566 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e3d7e0-dbba-4eb6-ac01-885b020435ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789578 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e3d7e0-dbba-4eb6-ac01-885b020435ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:55 crc kubenswrapper[4959]: E1007 13:26:55.789588 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="extract-utilities"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789594 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="extract-utilities"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789840 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e3d7e0-dbba-4eb6-ac01-885b020435ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.789865 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0b5626-ddf3-406b-974f-71a5d07bf03a" containerName="registry-server"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.790443 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.795053 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.795333 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.795335 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.795482 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.803040 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"]
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.897827 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbm8j\" (UniqueName: \"kubernetes.io/projected/1ccca019-7eaf-4648-89f0-10795926e8c4-kube-api-access-xbm8j\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.897874 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:55 crc kubenswrapper[4959]: I1007 13:26:55.897905 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.000135 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.001256 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.001789 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbm8j\" (UniqueName: \"kubernetes.io/projected/1ccca019-7eaf-4648-89f0-10795926e8c4-kube-api-access-xbm8j\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.008615 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.008931 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.021902 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbm8j\" (UniqueName: \"kubernetes.io/projected/1ccca019-7eaf-4648-89f0-10795926e8c4-kube-api-access-xbm8j\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.117341 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.624135 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"]
Oct 07 13:26:56 crc kubenswrapper[4959]: I1007 13:26:56.725837 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck" event={"ID":"1ccca019-7eaf-4648-89f0-10795926e8c4","Type":"ContainerStarted","Data":"ee9be6a51b2813472e83d769917305693f2b1bb6af20c13f8fc3f20f8dc92b97"}
Oct 07 13:26:57 crc kubenswrapper[4959]: I1007 13:26:57.735116 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck" event={"ID":"1ccca019-7eaf-4648-89f0-10795926e8c4","Type":"ContainerStarted","Data":"71f031dfb4d144077375134402fdd0f36301cc31b3de43111de19a5b0598224b"}
Oct 07 13:26:57 crc kubenswrapper[4959]: I1007 13:26:57.757675 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck" podStartSLOduration=2.182917622 podStartE2EDuration="2.757655702s" podCreationTimestamp="2025-10-07 13:26:55 +0000 UTC" firstStartedPulling="2025-10-07 13:26:56.629924215 +0000 UTC m=+1568.790646882" lastFinishedPulling="2025-10-07 13:26:57.204662285 +0000 UTC m=+1569.365384962" observedRunningTime="2025-10-07 13:26:57.751415266 +0000 UTC m=+1569.912137943" watchObservedRunningTime="2025-10-07 13:26:57.757655702 +0000 UTC m=+1569.918378379"
Oct 07 13:27:01 crc kubenswrapper[4959]: I1007 13:27:01.772182 4959 generic.go:334] "Generic (PLEG): container finished" podID="1ccca019-7eaf-4648-89f0-10795926e8c4" containerID="71f031dfb4d144077375134402fdd0f36301cc31b3de43111de19a5b0598224b" exitCode=0
Oct 07 13:27:01 crc kubenswrapper[4959]: I1007 13:27:01.772258 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck" event={"ID":"1ccca019-7eaf-4648-89f0-10795926e8c4","Type":"ContainerDied","Data":"71f031dfb4d144077375134402fdd0f36301cc31b3de43111de19a5b0598224b"}
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.161044 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.230312 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbm8j\" (UniqueName: \"kubernetes.io/projected/1ccca019-7eaf-4648-89f0-10795926e8c4-kube-api-access-xbm8j\") pod \"1ccca019-7eaf-4648-89f0-10795926e8c4\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") "
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.230505 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-inventory\") pod \"1ccca019-7eaf-4648-89f0-10795926e8c4\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") "
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.230615 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-ssh-key\") pod \"1ccca019-7eaf-4648-89f0-10795926e8c4\" (UID: \"1ccca019-7eaf-4648-89f0-10795926e8c4\") "
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.253530 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ccca019-7eaf-4648-89f0-10795926e8c4-kube-api-access-xbm8j" (OuterVolumeSpecName: "kube-api-access-xbm8j") pod "1ccca019-7eaf-4648-89f0-10795926e8c4" (UID: "1ccca019-7eaf-4648-89f0-10795926e8c4"). InnerVolumeSpecName "kube-api-access-xbm8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.260797 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ccca019-7eaf-4648-89f0-10795926e8c4" (UID: "1ccca019-7eaf-4648-89f0-10795926e8c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.268789 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-inventory" (OuterVolumeSpecName: "inventory") pod "1ccca019-7eaf-4648-89f0-10795926e8c4" (UID: "1ccca019-7eaf-4648-89f0-10795926e8c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.332260 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.332298 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ccca019-7eaf-4648-89f0-10795926e8c4-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.332312 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbm8j\" (UniqueName: \"kubernetes.io/projected/1ccca019-7eaf-4648-89f0-10795926e8c4-kube-api-access-xbm8j\") on node \"crc\" DevicePath \"\""
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.799430 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck" event={"ID":"1ccca019-7eaf-4648-89f0-10795926e8c4","Type":"ContainerDied","Data":"ee9be6a51b2813472e83d769917305693f2b1bb6af20c13f8fc3f20f8dc92b97"}
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.799476 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9be6a51b2813472e83d769917305693f2b1bb6af20c13f8fc3f20f8dc92b97"
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.799548 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.864048 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk"]
Oct 07 13:27:03 crc kubenswrapper[4959]: E1007 13:27:03.864467 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccca019-7eaf-4648-89f0-10795926e8c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.864480 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccca019-7eaf-4648-89f0-10795926e8c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.864726 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ccca019-7eaf-4648-89f0-10795926e8c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.865470 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.867841 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.868605 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.868858 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.869043 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:27:03 crc kubenswrapper[4959]: I1007 13:27:03.879960 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk"] Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.046960 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.047365 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwf7n\" (UniqueName: \"kubernetes.io/projected/9414071e-bead-487b-b728-68a86a02118f-kube-api-access-mwf7n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.047399 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.148828 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.149538 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwf7n\" (UniqueName: \"kubernetes.io/projected/9414071e-bead-487b-b728-68a86a02118f-kube-api-access-mwf7n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.149573 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.155446 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: 
\"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.161475 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.171211 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwf7n\" (UniqueName: \"kubernetes.io/projected/9414071e-bead-487b-b728-68a86a02118f-kube-api-access-mwf7n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d47kk\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.185837 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.683715 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk"] Oct 07 13:27:04 crc kubenswrapper[4959]: I1007 13:27:04.821137 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" event={"ID":"9414071e-bead-487b-b728-68a86a02118f","Type":"ContainerStarted","Data":"f8fc044fd392297b43ae64b556e9cbfeb382cb8e338690b291d4d0f965ae93d2"} Oct 07 13:27:05 crc kubenswrapper[4959]: I1007 13:27:05.818296 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" event={"ID":"9414071e-bead-487b-b728-68a86a02118f","Type":"ContainerStarted","Data":"73b81f768b5da76e100442f7d2d698de5c45908e61f9fb85ec64bbf5596d7c3a"} Oct 07 13:27:05 crc kubenswrapper[4959]: I1007 13:27:05.861848 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" podStartSLOduration=2.442556111 podStartE2EDuration="2.861831049s" podCreationTimestamp="2025-10-07 13:27:03 +0000 UTC" firstStartedPulling="2025-10-07 13:27:04.690929365 +0000 UTC m=+1576.851652042" lastFinishedPulling="2025-10-07 13:27:05.110204303 +0000 UTC m=+1577.270926980" observedRunningTime="2025-10-07 13:27:05.853233846 +0000 UTC m=+1578.013956523" watchObservedRunningTime="2025-10-07 13:27:05.861831049 +0000 UTC m=+1578.022553726" Oct 07 13:27:07 crc kubenswrapper[4959]: I1007 13:27:07.696547 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:27:07 crc 
kubenswrapper[4959]: I1007 13:27:07.697291 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:27:20 crc kubenswrapper[4959]: I1007 13:27:20.044896 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-vcm4z"] Oct 07 13:27:20 crc kubenswrapper[4959]: I1007 13:27:20.058442 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-vcm4z"] Oct 07 13:27:20 crc kubenswrapper[4959]: I1007 13:27:20.849059 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1504679e-c96c-4491-9c41-fd003beb5296" path="/var/lib/kubelet/pods/1504679e-c96c-4491-9c41-fd003beb5296/volumes" Oct 07 13:27:29 crc kubenswrapper[4959]: I1007 13:27:29.033711 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-strcc"] Oct 07 13:27:29 crc kubenswrapper[4959]: I1007 13:27:29.044477 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-strcc"] Oct 07 13:27:30 crc kubenswrapper[4959]: I1007 13:27:30.823152 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaeb69e1-bac6-448b-832e-d6d32a47547a" path="/var/lib/kubelet/pods/aaeb69e1-bac6-448b-832e-d6d32a47547a/volumes" Oct 07 13:27:34 crc kubenswrapper[4959]: I1007 13:27:34.031587 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-kcfg6"] Oct 07 13:27:34 crc kubenswrapper[4959]: I1007 13:27:34.038386 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-kcfg6"] Oct 07 13:27:34 crc kubenswrapper[4959]: I1007 13:27:34.823050 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d" path="/var/lib/kubelet/pods/8bc9e3c3-04dc-4c8a-8ff1-d0aa99d3b91d/volumes" Oct 07 13:27:36 crc kubenswrapper[4959]: I1007 13:27:36.030392 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-5j2t5"] Oct 07 13:27:36 crc kubenswrapper[4959]: I1007 13:27:36.036923 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-5j2t5"] Oct 07 13:27:36 crc kubenswrapper[4959]: I1007 13:27:36.823375 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5722dd5-41d2-40bd-bd65-e57d7567ecf7" path="/var/lib/kubelet/pods/c5722dd5-41d2-40bd-bd65-e57d7567ecf7/volumes" Oct 07 13:27:37 crc kubenswrapper[4959]: I1007 13:27:37.695924 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:27:37 crc kubenswrapper[4959]: I1007 13:27:37.696274 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:27:37 crc kubenswrapper[4959]: I1007 13:27:37.696322 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:27:37 crc kubenswrapper[4959]: I1007 13:27:37.697036 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:27:37 crc kubenswrapper[4959]: I1007 13:27:37.697092 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" gracePeriod=600 Oct 07 13:27:37 crc kubenswrapper[4959]: E1007 13:27:37.817093 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:27:38 crc kubenswrapper[4959]: I1007 13:27:38.108073 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" exitCode=0 Oct 07 13:27:38 crc kubenswrapper[4959]: I1007 13:27:38.108117 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a"} Oct 07 13:27:38 crc kubenswrapper[4959]: I1007 13:27:38.108155 4959 scope.go:117] "RemoveContainer" containerID="27ace40315804865739527b95af409b39ad16c231be4b707c59ad02e2e723a6d" Oct 07 13:27:38 crc kubenswrapper[4959]: I1007 13:27:38.108784 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:27:38 crc kubenswrapper[4959]: E1007 13:27:38.109158 4959 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:27:49 crc kubenswrapper[4959]: I1007 13:27:49.810278 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:27:49 crc kubenswrapper[4959]: E1007 13:27:49.811033 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.113878 4959 scope.go:117] "RemoveContainer" containerID="a27bcfc95ac6fade11d6ff4d4c83a2ab22cda6fbc673538c54442d9cc6f8cfcf" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.147899 4959 scope.go:117] "RemoveContainer" containerID="a6a116d4990e4e4662b1890b5d66e691092d7c189d2a2eaa6e54657aa34b805e" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.193769 4959 scope.go:117] "RemoveContainer" containerID="28650878bfd8bce39078769be19fa9e1091264b087e756d0cac0fc335d5b2582" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.282296 4959 scope.go:117] "RemoveContainer" containerID="571142fd1b7a88f5f0f8323aed6243b717f68683c07421e58211dfd7cf77865e" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.302716 4959 scope.go:117] "RemoveContainer" containerID="ae0ef4eef7f7669587b38db33c20ef12611e18ca5de024ecaefb1de294baecf8" Oct 07 
13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.335647 4959 scope.go:117] "RemoveContainer" containerID="f6564aeab92facf5badef9144c1a43b87bb55b0e6c8516f790beab8b34dabda3" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.375586 4959 scope.go:117] "RemoveContainer" containerID="708a9382a473495a0e7a1f5893576bfb745e62b1c85d24f08e67f2a93cbbbdae" Oct 07 13:27:52 crc kubenswrapper[4959]: I1007 13:27:52.395406 4959 scope.go:117] "RemoveContainer" containerID="16a234b398c31c4095d4a34efd0b605e7ad2c3196abcde28ef2ad08e3f8c57a3" Oct 07 13:27:53 crc kubenswrapper[4959]: I1007 13:27:53.085138 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qkdbw"] Oct 07 13:27:53 crc kubenswrapper[4959]: I1007 13:27:53.095852 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qkdbw"] Oct 07 13:27:54 crc kubenswrapper[4959]: I1007 13:27:54.822193 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd267601-4074-4cfb-8b40-8cd5fa12917c" path="/var/lib/kubelet/pods/bd267601-4074-4cfb-8b40-8cd5fa12917c/volumes" Oct 07 13:28:00 crc kubenswrapper[4959]: I1007 13:28:00.327345 4959 generic.go:334] "Generic (PLEG): container finished" podID="9414071e-bead-487b-b728-68a86a02118f" containerID="73b81f768b5da76e100442f7d2d698de5c45908e61f9fb85ec64bbf5596d7c3a" exitCode=2 Oct 07 13:28:00 crc kubenswrapper[4959]: I1007 13:28:00.327509 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" event={"ID":"9414071e-bead-487b-b728-68a86a02118f","Type":"ContainerDied","Data":"73b81f768b5da76e100442f7d2d698de5c45908e61f9fb85ec64bbf5596d7c3a"} Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.758821 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.884780 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwf7n\" (UniqueName: \"kubernetes.io/projected/9414071e-bead-487b-b728-68a86a02118f-kube-api-access-mwf7n\") pod \"9414071e-bead-487b-b728-68a86a02118f\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.885059 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-inventory\") pod \"9414071e-bead-487b-b728-68a86a02118f\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.885136 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-ssh-key\") pod \"9414071e-bead-487b-b728-68a86a02118f\" (UID: \"9414071e-bead-487b-b728-68a86a02118f\") " Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.890047 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9414071e-bead-487b-b728-68a86a02118f-kube-api-access-mwf7n" (OuterVolumeSpecName: "kube-api-access-mwf7n") pod "9414071e-bead-487b-b728-68a86a02118f" (UID: "9414071e-bead-487b-b728-68a86a02118f"). InnerVolumeSpecName "kube-api-access-mwf7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.912394 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-inventory" (OuterVolumeSpecName: "inventory") pod "9414071e-bead-487b-b728-68a86a02118f" (UID: "9414071e-bead-487b-b728-68a86a02118f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.928632 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9414071e-bead-487b-b728-68a86a02118f" (UID: "9414071e-bead-487b-b728-68a86a02118f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.987670 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.987904 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9414071e-bead-487b-b728-68a86a02118f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:28:01 crc kubenswrapper[4959]: I1007 13:28:01.987974 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwf7n\" (UniqueName: \"kubernetes.io/projected/9414071e-bead-487b-b728-68a86a02118f-kube-api-access-mwf7n\") on node \"crc\" DevicePath \"\"" Oct 07 13:28:02 crc kubenswrapper[4959]: I1007 13:28:02.344792 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" event={"ID":"9414071e-bead-487b-b728-68a86a02118f","Type":"ContainerDied","Data":"f8fc044fd392297b43ae64b556e9cbfeb382cb8e338690b291d4d0f965ae93d2"} Oct 07 13:28:02 crc kubenswrapper[4959]: I1007 13:28:02.344852 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8fc044fd392297b43ae64b556e9cbfeb382cb8e338690b291d4d0f965ae93d2" Oct 07 13:28:02 crc kubenswrapper[4959]: I1007 13:28:02.344897 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk" Oct 07 13:28:03 crc kubenswrapper[4959]: I1007 13:28:03.809587 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:28:03 crc kubenswrapper[4959]: E1007 13:28:03.810628 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.026986 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8"] Oct 07 13:28:09 crc kubenswrapper[4959]: E1007 13:28:09.030252 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9414071e-bead-487b-b728-68a86a02118f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.030280 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9414071e-bead-487b-b728-68a86a02118f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.030448 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9414071e-bead-487b-b728-68a86a02118f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.031058 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.033757 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.035207 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.035230 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.035207 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.046061 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8"] Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.229482 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx8bx\" (UniqueName: \"kubernetes.io/projected/076a4a29-8705-457c-bfe4-0949f231be22-kube-api-access-qx8bx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.229548 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.229658 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.331775 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx8bx\" (UniqueName: \"kubernetes.io/projected/076a4a29-8705-457c-bfe4-0949f231be22-kube-api-access-qx8bx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.331852 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.331923 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.337913 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: 
\"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.338160 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.357050 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx8bx\" (UniqueName: \"kubernetes.io/projected/076a4a29-8705-457c-bfe4-0949f231be22-kube-api-access-qx8bx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:09 crc kubenswrapper[4959]: I1007 13:28:09.647183 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:10 crc kubenswrapper[4959]: I1007 13:28:10.132612 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8"] Oct 07 13:28:10 crc kubenswrapper[4959]: I1007 13:28:10.417663 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" event={"ID":"076a4a29-8705-457c-bfe4-0949f231be22","Type":"ContainerStarted","Data":"405f188d78c864c68d61051a9e4125db063368feac3b17577fcf1e997b0fb8ec"} Oct 07 13:28:11 crc kubenswrapper[4959]: I1007 13:28:11.430291 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" event={"ID":"076a4a29-8705-457c-bfe4-0949f231be22","Type":"ContainerStarted","Data":"cbf10ed9f1503a01d70a09e59749f70dcb3934230b05b3b12d48363a6f192e55"} Oct 07 13:28:11 crc kubenswrapper[4959]: I1007 13:28:11.452201 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" podStartSLOduration=1.980436928 podStartE2EDuration="2.452179875s" podCreationTimestamp="2025-10-07 13:28:09 +0000 UTC" firstStartedPulling="2025-10-07 13:28:10.138501868 +0000 UTC m=+1642.299224545" lastFinishedPulling="2025-10-07 13:28:10.610244775 +0000 UTC m=+1642.770967492" observedRunningTime="2025-10-07 13:28:11.447187865 +0000 UTC m=+1643.607910552" watchObservedRunningTime="2025-10-07 13:28:11.452179875 +0000 UTC m=+1643.612902552" Oct 07 13:28:13 crc kubenswrapper[4959]: I1007 13:28:13.040478 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-27bts"] Oct 07 13:28:13 crc kubenswrapper[4959]: I1007 13:28:13.051201 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9np9s"] Oct 07 13:28:13 crc kubenswrapper[4959]: I1007 
13:28:13.058008 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9np9s"] Oct 07 13:28:13 crc kubenswrapper[4959]: I1007 13:28:13.064904 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-27bts"] Oct 07 13:28:14 crc kubenswrapper[4959]: I1007 13:28:14.029008 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-rqlwl"] Oct 07 13:28:14 crc kubenswrapper[4959]: I1007 13:28:14.035696 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-rqlwl"] Oct 07 13:28:14 crc kubenswrapper[4959]: I1007 13:28:14.820453 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65663a28-f7e8-430c-92e9-ba8e346b04ba" path="/var/lib/kubelet/pods/65663a28-f7e8-430c-92e9-ba8e346b04ba/volumes" Oct 07 13:28:14 crc kubenswrapper[4959]: I1007 13:28:14.821689 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87f137d-3e0f-423b-af71-2197ae7d9cf2" path="/var/lib/kubelet/pods/a87f137d-3e0f-423b-af71-2197ae7d9cf2/volumes" Oct 07 13:28:14 crc kubenswrapper[4959]: I1007 13:28:14.822853 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09add8f-d15e-47ea-83c3-8cd2512ae67a" path="/var/lib/kubelet/pods/e09add8f-d15e-47ea-83c3-8cd2512ae67a/volumes" Oct 07 13:28:17 crc kubenswrapper[4959]: I1007 13:28:17.808729 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:28:17 crc kubenswrapper[4959]: E1007 13:28:17.809465 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:28:23 crc kubenswrapper[4959]: I1007 13:28:23.030825 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8eca-account-create-ppk85"] Oct 07 13:28:23 crc kubenswrapper[4959]: I1007 13:28:23.037120 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8eca-account-create-ppk85"] Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.022844 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3a92-account-create-blh5r"] Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.032136 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b8a0-account-create-wdlvk"] Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.038828 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3a92-account-create-blh5r"] Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.045266 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b8a0-account-create-wdlvk"] Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.819877 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abd2ce0-24fb-4a3e-abce-ab6e4a693cde" path="/var/lib/kubelet/pods/1abd2ce0-24fb-4a3e-abce-ab6e4a693cde/volumes" Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.820555 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7" path="/var/lib/kubelet/pods/4143bfa4-c3ac-445e-be95-ef5dc2aa6fc7/volumes" Oct 07 13:28:24 crc kubenswrapper[4959]: I1007 13:28:24.821064 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c" path="/var/lib/kubelet/pods/a53ee4c4-47e3-4213-bbe8-6d2f92f28a4c/volumes" Oct 07 13:28:29 crc kubenswrapper[4959]: I1007 13:28:29.809109 4959 scope.go:117] "RemoveContainer" 
containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:28:29 crc kubenswrapper[4959]: E1007 13:28:29.810156 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:28:42 crc kubenswrapper[4959]: I1007 13:28:42.809834 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:28:42 crc kubenswrapper[4959]: E1007 13:28:42.811295 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:28:47 crc kubenswrapper[4959]: I1007 13:28:47.066441 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xphbd"] Oct 07 13:28:47 crc kubenswrapper[4959]: I1007 13:28:47.076400 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xphbd"] Oct 07 13:28:48 crc kubenswrapper[4959]: I1007 13:28:48.866424 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdeea2ce-cd7b-4608-8dc8-bab322fc76db" path="/var/lib/kubelet/pods/cdeea2ce-cd7b-4608-8dc8-bab322fc76db/volumes" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.545729 4959 scope.go:117] "RemoveContainer" 
containerID="b045ffcddbcdb73ecdef699772ca7c33fc1967b028c73276ea054786e510923c" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.580763 4959 scope.go:117] "RemoveContainer" containerID="8869094badb76c84646e0c5bfa5b6c1571df98ca907bfc24247c204c12a28896" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.636561 4959 scope.go:117] "RemoveContainer" containerID="19bc1701a6c81a9ee4bf8dce3abf860b68373c0e4a1cecbccb882db2a9b6a3f9" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.673053 4959 scope.go:117] "RemoveContainer" containerID="5c2721ebba6fc60a7a04d1082da2d836990e7c75a046a89409eb55ee38ce5e2b" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.713272 4959 scope.go:117] "RemoveContainer" containerID="0cc6a6b5660cbd3b85ab9a6de6de515f6e3526281b46c322674a33d2fb539da4" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.765671 4959 scope.go:117] "RemoveContainer" containerID="37ca17308a328c74235682edcbd7f343a853b9cde8f257b6b5232ec1c02d9adb" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.806506 4959 scope.go:117] "RemoveContainer" containerID="01acbc9e010509d954b70d866b7db920dd22a35b9badcd858e2f3b93f378d348" Oct 07 13:28:52 crc kubenswrapper[4959]: I1007 13:28:52.857505 4959 scope.go:117] "RemoveContainer" containerID="d3bf710636f8a743877ef42a6ba13f827b85d4ca2d916821a3add78c86983b6c" Oct 07 13:28:55 crc kubenswrapper[4959]: I1007 13:28:55.808969 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:28:55 crc kubenswrapper[4959]: E1007 13:28:55.809842 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:28:55 crc kubenswrapper[4959]: I1007 13:28:55.821277 4959 generic.go:334] "Generic (PLEG): container finished" podID="076a4a29-8705-457c-bfe4-0949f231be22" containerID="cbf10ed9f1503a01d70a09e59749f70dcb3934230b05b3b12d48363a6f192e55" exitCode=0 Oct 07 13:28:55 crc kubenswrapper[4959]: I1007 13:28:55.821341 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" event={"ID":"076a4a29-8705-457c-bfe4-0949f231be22","Type":"ContainerDied","Data":"cbf10ed9f1503a01d70a09e59749f70dcb3934230b05b3b12d48363a6f192e55"} Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.349372 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.470388 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx8bx\" (UniqueName: \"kubernetes.io/projected/076a4a29-8705-457c-bfe4-0949f231be22-kube-api-access-qx8bx\") pod \"076a4a29-8705-457c-bfe4-0949f231be22\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.470506 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-ssh-key\") pod \"076a4a29-8705-457c-bfe4-0949f231be22\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.470641 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-inventory\") pod \"076a4a29-8705-457c-bfe4-0949f231be22\" (UID: \"076a4a29-8705-457c-bfe4-0949f231be22\") " Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.475708 4959 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076a4a29-8705-457c-bfe4-0949f231be22-kube-api-access-qx8bx" (OuterVolumeSpecName: "kube-api-access-qx8bx") pod "076a4a29-8705-457c-bfe4-0949f231be22" (UID: "076a4a29-8705-457c-bfe4-0949f231be22"). InnerVolumeSpecName "kube-api-access-qx8bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.496189 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-inventory" (OuterVolumeSpecName: "inventory") pod "076a4a29-8705-457c-bfe4-0949f231be22" (UID: "076a4a29-8705-457c-bfe4-0949f231be22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.496791 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "076a4a29-8705-457c-bfe4-0949f231be22" (UID: "076a4a29-8705-457c-bfe4-0949f231be22"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.573243 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx8bx\" (UniqueName: \"kubernetes.io/projected/076a4a29-8705-457c-bfe4-0949f231be22-kube-api-access-qx8bx\") on node \"crc\" DevicePath \"\"" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.573415 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.573498 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076a4a29-8705-457c-bfe4-0949f231be22-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.840143 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" event={"ID":"076a4a29-8705-457c-bfe4-0949f231be22","Type":"ContainerDied","Data":"405f188d78c864c68d61051a9e4125db063368feac3b17577fcf1e997b0fb8ec"} Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.840242 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405f188d78c864c68d61051a9e4125db063368feac3b17577fcf1e997b0fb8ec" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.840253 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.939565 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxbwq"] Oct 07 13:28:57 crc kubenswrapper[4959]: E1007 13:28:57.940391 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076a4a29-8705-457c-bfe4-0949f231be22" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.940416 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="076a4a29-8705-457c-bfe4-0949f231be22" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.940689 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="076a4a29-8705-457c-bfe4-0949f231be22" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.941393 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.944138 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.944422 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.944697 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.944939 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:28:57 crc kubenswrapper[4959]: I1007 13:28:57.964055 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxbwq"] Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.083793 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rf2p\" (UniqueName: \"kubernetes.io/projected/02827e48-f5f8-49d2-8442-8e59f8fc1395-kube-api-access-9rf2p\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.083863 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.084005 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.185126 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rf2p\" (UniqueName: \"kubernetes.io/projected/02827e48-f5f8-49d2-8442-8e59f8fc1395-kube-api-access-9rf2p\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.185223 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.185431 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.196489 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.196556 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.202216 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rf2p\" (UniqueName: \"kubernetes.io/projected/02827e48-f5f8-49d2-8442-8e59f8fc1395-kube-api-access-9rf2p\") pod \"ssh-known-hosts-edpm-deployment-kxbwq\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.260812 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.786409 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxbwq"] Oct 07 13:28:58 crc kubenswrapper[4959]: I1007 13:28:58.858687 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" event={"ID":"02827e48-f5f8-49d2-8442-8e59f8fc1395","Type":"ContainerStarted","Data":"155d0fc572f5b07d4c17186121158c7ad8138e139a4ae9dff90a37bea99abd51"} Oct 07 13:29:00 crc kubenswrapper[4959]: I1007 13:29:00.874964 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" event={"ID":"02827e48-f5f8-49d2-8442-8e59f8fc1395","Type":"ContainerStarted","Data":"b69b14929ea7ddc91ae43a3298eeed4d06333baa0348a4b99c72946637cccf8c"} Oct 07 13:29:00 crc kubenswrapper[4959]: I1007 13:29:00.899714 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" 
podStartSLOduration=2.983314398 podStartE2EDuration="3.899697037s" podCreationTimestamp="2025-10-07 13:28:57 +0000 UTC" firstStartedPulling="2025-10-07 13:28:58.794325644 +0000 UTC m=+1690.955048321" lastFinishedPulling="2025-10-07 13:28:59.710708263 +0000 UTC m=+1691.871430960" observedRunningTime="2025-10-07 13:29:00.891280109 +0000 UTC m=+1693.052002786" watchObservedRunningTime="2025-10-07 13:29:00.899697037 +0000 UTC m=+1693.060419714" Oct 07 13:29:07 crc kubenswrapper[4959]: I1007 13:29:07.941971 4959 generic.go:334] "Generic (PLEG): container finished" podID="02827e48-f5f8-49d2-8442-8e59f8fc1395" containerID="b69b14929ea7ddc91ae43a3298eeed4d06333baa0348a4b99c72946637cccf8c" exitCode=0 Oct 07 13:29:07 crc kubenswrapper[4959]: I1007 13:29:07.942025 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" event={"ID":"02827e48-f5f8-49d2-8442-8e59f8fc1395","Type":"ContainerDied","Data":"b69b14929ea7ddc91ae43a3298eeed4d06333baa0348a4b99c72946637cccf8c"} Oct 07 13:29:08 crc kubenswrapper[4959]: I1007 13:29:08.814985 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:29:08 crc kubenswrapper[4959]: E1007 13:29:08.816292 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.047290 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-l98vg"] Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.054046 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-9g4tj"] Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.062471 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9g4tj"] Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.069090 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-l98vg"] Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.426603 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.514657 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-ssh-key-openstack-edpm-ipam\") pod \"02827e48-f5f8-49d2-8442-8e59f8fc1395\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.514832 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rf2p\" (UniqueName: \"kubernetes.io/projected/02827e48-f5f8-49d2-8442-8e59f8fc1395-kube-api-access-9rf2p\") pod \"02827e48-f5f8-49d2-8442-8e59f8fc1395\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.514894 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-inventory-0\") pod \"02827e48-f5f8-49d2-8442-8e59f8fc1395\" (UID: \"02827e48-f5f8-49d2-8442-8e59f8fc1395\") " Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.524840 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02827e48-f5f8-49d2-8442-8e59f8fc1395-kube-api-access-9rf2p" (OuterVolumeSpecName: "kube-api-access-9rf2p") pod "02827e48-f5f8-49d2-8442-8e59f8fc1395" (UID: 
"02827e48-f5f8-49d2-8442-8e59f8fc1395"). InnerVolumeSpecName "kube-api-access-9rf2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.542193 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02827e48-f5f8-49d2-8442-8e59f8fc1395" (UID: "02827e48-f5f8-49d2-8442-8e59f8fc1395"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.557072 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "02827e48-f5f8-49d2-8442-8e59f8fc1395" (UID: "02827e48-f5f8-49d2-8442-8e59f8fc1395"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.616482 4959 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.616515 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02827e48-f5f8-49d2-8442-8e59f8fc1395-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.616526 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rf2p\" (UniqueName: \"kubernetes.io/projected/02827e48-f5f8-49d2-8442-8e59f8fc1395-kube-api-access-9rf2p\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.963084 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" event={"ID":"02827e48-f5f8-49d2-8442-8e59f8fc1395","Type":"ContainerDied","Data":"155d0fc572f5b07d4c17186121158c7ad8138e139a4ae9dff90a37bea99abd51"} Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.963137 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="155d0fc572f5b07d4c17186121158c7ad8138e139a4ae9dff90a37bea99abd51" Oct 07 13:29:09 crc kubenswrapper[4959]: I1007 13:29:09.963219 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxbwq" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.032343 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f"] Oct 07 13:29:10 crc kubenswrapper[4959]: E1007 13:29:10.032772 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02827e48-f5f8-49d2-8442-8e59f8fc1395" containerName="ssh-known-hosts-edpm-deployment" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.032791 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="02827e48-f5f8-49d2-8442-8e59f8fc1395" containerName="ssh-known-hosts-edpm-deployment" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.033059 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="02827e48-f5f8-49d2-8442-8e59f8fc1395" containerName="ssh-known-hosts-edpm-deployment" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.033861 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.035707 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.037338 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.037909 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.038111 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.063049 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f"] Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.145009 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjmr\" (UniqueName: \"kubernetes.io/projected/7db4250e-518b-4083-8fa6-537615e53340-kube-api-access-xvjmr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.145169 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.145282 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.246978 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.247169 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjmr\" (UniqueName: \"kubernetes.io/projected/7db4250e-518b-4083-8fa6-537615e53340-kube-api-access-xvjmr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.247291 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.251897 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.253928 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.270039 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjmr\" (UniqueName: \"kubernetes.io/projected/7db4250e-518b-4083-8fa6-537615e53340-kube-api-access-xvjmr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c2z8f\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.363830 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.821620 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41dde621-534c-4f39-ab29-baa7401101a8" path="/var/lib/kubelet/pods/41dde621-534c-4f39-ab29-baa7401101a8/volumes" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.823137 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b53f390-b59f-45fd-8ec7-e405e011f07d" path="/var/lib/kubelet/pods/8b53f390-b59f-45fd-8ec7-e405e011f07d/volumes" Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.873715 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f"] Oct 07 13:29:10 crc kubenswrapper[4959]: I1007 13:29:10.971277 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" event={"ID":"7db4250e-518b-4083-8fa6-537615e53340","Type":"ContainerStarted","Data":"e22a7db3f8e1cd1590b0e8dce1b4e3573c478c8d5fe50ce7c2fe3d0c6dcb7934"} Oct 07 13:29:11 crc kubenswrapper[4959]: I1007 13:29:11.981609 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" event={"ID":"7db4250e-518b-4083-8fa6-537615e53340","Type":"ContainerStarted","Data":"859a1f1699d4d70ac2543998c1dee8c5f536514e5364ee1d785901df3052f839"} Oct 07 13:29:12 crc kubenswrapper[4959]: I1007 13:29:12.004564 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" podStartSLOduration=1.590472628 podStartE2EDuration="2.004546749s" podCreationTimestamp="2025-10-07 13:29:10 +0000 UTC" firstStartedPulling="2025-10-07 13:29:10.873989374 +0000 UTC m=+1703.034712051" lastFinishedPulling="2025-10-07 13:29:11.288063485 +0000 UTC m=+1703.448786172" observedRunningTime="2025-10-07 13:29:12.001067941 +0000 UTC 
m=+1704.161790668" watchObservedRunningTime="2025-10-07 13:29:12.004546749 +0000 UTC m=+1704.165269426" Oct 07 13:29:20 crc kubenswrapper[4959]: I1007 13:29:20.077348 4959 generic.go:334] "Generic (PLEG): container finished" podID="7db4250e-518b-4083-8fa6-537615e53340" containerID="859a1f1699d4d70ac2543998c1dee8c5f536514e5364ee1d785901df3052f839" exitCode=0 Oct 07 13:29:20 crc kubenswrapper[4959]: I1007 13:29:20.077451 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" event={"ID":"7db4250e-518b-4083-8fa6-537615e53340","Type":"ContainerDied","Data":"859a1f1699d4d70ac2543998c1dee8c5f536514e5364ee1d785901df3052f839"} Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.491124 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.569247 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-ssh-key\") pod \"7db4250e-518b-4083-8fa6-537615e53340\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.569393 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvjmr\" (UniqueName: \"kubernetes.io/projected/7db4250e-518b-4083-8fa6-537615e53340-kube-api-access-xvjmr\") pod \"7db4250e-518b-4083-8fa6-537615e53340\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.569573 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-inventory\") pod \"7db4250e-518b-4083-8fa6-537615e53340\" (UID: \"7db4250e-518b-4083-8fa6-537615e53340\") " Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 
13:29:21.574739 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db4250e-518b-4083-8fa6-537615e53340-kube-api-access-xvjmr" (OuterVolumeSpecName: "kube-api-access-xvjmr") pod "7db4250e-518b-4083-8fa6-537615e53340" (UID: "7db4250e-518b-4083-8fa6-537615e53340"). InnerVolumeSpecName "kube-api-access-xvjmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.596159 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-inventory" (OuterVolumeSpecName: "inventory") pod "7db4250e-518b-4083-8fa6-537615e53340" (UID: "7db4250e-518b-4083-8fa6-537615e53340"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.611832 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7db4250e-518b-4083-8fa6-537615e53340" (UID: "7db4250e-518b-4083-8fa6-537615e53340"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.671330 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.671364 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db4250e-518b-4083-8fa6-537615e53340-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:21 crc kubenswrapper[4959]: I1007 13:29:21.671376 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvjmr\" (UniqueName: \"kubernetes.io/projected/7db4250e-518b-4083-8fa6-537615e53340-kube-api-access-xvjmr\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.097293 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" event={"ID":"7db4250e-518b-4083-8fa6-537615e53340","Type":"ContainerDied","Data":"e22a7db3f8e1cd1590b0e8dce1b4e3573c478c8d5fe50ce7c2fe3d0c6dcb7934"} Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.097334 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22a7db3f8e1cd1590b0e8dce1b4e3573c478c8d5fe50ce7c2fe3d0c6dcb7934" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.097357 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.164543 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w"] Oct 07 13:29:22 crc kubenswrapper[4959]: E1007 13:29:22.165144 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4250e-518b-4083-8fa6-537615e53340" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.165226 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db4250e-518b-4083-8fa6-537615e53340" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.165463 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4250e-518b-4083-8fa6-537615e53340" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.166131 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.168753 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.169514 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.170477 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.170733 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.184212 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w"] Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.285497 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.285588 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.285719 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjz5z\" (UniqueName: \"kubernetes.io/projected/f91419b4-1b79-4949-a479-2b64fa725b36-kube-api-access-xjz5z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.386878 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.386967 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.387053 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjz5z\" (UniqueName: \"kubernetes.io/projected/f91419b4-1b79-4949-a479-2b64fa725b36-kube-api-access-xjz5z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.390728 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: 
\"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.392311 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.409070 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjz5z\" (UniqueName: \"kubernetes.io/projected/f91419b4-1b79-4949-a479-2b64fa725b36-kube-api-access-xjz5z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.487800 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:22 crc kubenswrapper[4959]: I1007 13:29:22.996245 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w"] Oct 07 13:29:23 crc kubenswrapper[4959]: I1007 13:29:23.109402 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" event={"ID":"f91419b4-1b79-4949-a479-2b64fa725b36","Type":"ContainerStarted","Data":"a367b589ac3829da24399e4f7e102be87bd981f1d17b89b43cbfc48e792a2678"} Oct 07 13:29:23 crc kubenswrapper[4959]: I1007 13:29:23.809412 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:29:23 crc kubenswrapper[4959]: E1007 13:29:23.810003 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:29:24 crc kubenswrapper[4959]: I1007 13:29:24.119958 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" event={"ID":"f91419b4-1b79-4949-a479-2b64fa725b36","Type":"ContainerStarted","Data":"028feddaa85176da70d71bba1ecd8f32911fe676a12b1e045cab3cc2f8703659"} Oct 07 13:29:24 crc kubenswrapper[4959]: I1007 13:29:24.142900 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" podStartSLOduration=1.646284716 podStartE2EDuration="2.142871203s" podCreationTimestamp="2025-10-07 13:29:22 +0000 UTC" firstStartedPulling="2025-10-07 
13:29:22.999120475 +0000 UTC m=+1715.159843142" lastFinishedPulling="2025-10-07 13:29:23.495706902 +0000 UTC m=+1715.656429629" observedRunningTime="2025-10-07 13:29:24.138016307 +0000 UTC m=+1716.298739004" watchObservedRunningTime="2025-10-07 13:29:24.142871203 +0000 UTC m=+1716.303593880" Oct 07 13:29:33 crc kubenswrapper[4959]: I1007 13:29:33.199708 4959 generic.go:334] "Generic (PLEG): container finished" podID="f91419b4-1b79-4949-a479-2b64fa725b36" containerID="028feddaa85176da70d71bba1ecd8f32911fe676a12b1e045cab3cc2f8703659" exitCode=0 Oct 07 13:29:33 crc kubenswrapper[4959]: I1007 13:29:33.199799 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" event={"ID":"f91419b4-1b79-4949-a479-2b64fa725b36","Type":"ContainerDied","Data":"028feddaa85176da70d71bba1ecd8f32911fe676a12b1e045cab3cc2f8703659"} Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.600149 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.695717 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjz5z\" (UniqueName: \"kubernetes.io/projected/f91419b4-1b79-4949-a479-2b64fa725b36-kube-api-access-xjz5z\") pod \"f91419b4-1b79-4949-a479-2b64fa725b36\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.695787 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-ssh-key\") pod \"f91419b4-1b79-4949-a479-2b64fa725b36\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.695836 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-inventory\") pod \"f91419b4-1b79-4949-a479-2b64fa725b36\" (UID: \"f91419b4-1b79-4949-a479-2b64fa725b36\") " Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.704520 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91419b4-1b79-4949-a479-2b64fa725b36-kube-api-access-xjz5z" (OuterVolumeSpecName: "kube-api-access-xjz5z") pod "f91419b4-1b79-4949-a479-2b64fa725b36" (UID: "f91419b4-1b79-4949-a479-2b64fa725b36"). InnerVolumeSpecName "kube-api-access-xjz5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.723807 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f91419b4-1b79-4949-a479-2b64fa725b36" (UID: "f91419b4-1b79-4949-a479-2b64fa725b36"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.754149 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-inventory" (OuterVolumeSpecName: "inventory") pod "f91419b4-1b79-4949-a479-2b64fa725b36" (UID: "f91419b4-1b79-4949-a479-2b64fa725b36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.798691 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.798730 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f91419b4-1b79-4949-a479-2b64fa725b36-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.798744 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjz5z\" (UniqueName: \"kubernetes.io/projected/f91419b4-1b79-4949-a479-2b64fa725b36-kube-api-access-xjz5z\") on node \"crc\" DevicePath \"\"" Oct 07 13:29:34 crc kubenswrapper[4959]: I1007 13:29:34.808727 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:29:34 crc kubenswrapper[4959]: E1007 13:29:34.809135 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:29:35 crc kubenswrapper[4959]: I1007 13:29:35.217399 
4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" event={"ID":"f91419b4-1b79-4949-a479-2b64fa725b36","Type":"ContainerDied","Data":"a367b589ac3829da24399e4f7e102be87bd981f1d17b89b43cbfc48e792a2678"} Oct 07 13:29:35 crc kubenswrapper[4959]: I1007 13:29:35.217740 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a367b589ac3829da24399e4f7e102be87bd981f1d17b89b43cbfc48e792a2678" Oct 07 13:29:35 crc kubenswrapper[4959]: I1007 13:29:35.217475 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w" Oct 07 13:29:47 crc kubenswrapper[4959]: I1007 13:29:47.809136 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:29:47 crc kubenswrapper[4959]: E1007 13:29:47.810244 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:29:53 crc kubenswrapper[4959]: I1007 13:29:53.026144 4959 scope.go:117] "RemoveContainer" containerID="3d549af519014fd957043cd75c7f5cc9ef727243f01d35f9d72e53fea55d5529" Oct 07 13:29:53 crc kubenswrapper[4959]: I1007 13:29:53.083802 4959 scope.go:117] "RemoveContainer" containerID="45990344e2031840c1955601b359ba30a984e300f695da70e737a9e39b96fedd" Oct 07 13:29:54 crc kubenswrapper[4959]: I1007 13:29:54.046801 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-clpnc"] Oct 07 13:29:54 crc kubenswrapper[4959]: I1007 13:29:54.053112 4959 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-clpnc"] Oct 07 13:29:54 crc kubenswrapper[4959]: I1007 13:29:54.830360 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b13bdd1-f729-4a5a-bef3-4587cdac360f" path="/var/lib/kubelet/pods/4b13bdd1-f729-4a5a-bef3-4587cdac360f/volumes" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.154548 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt"] Oct 07 13:30:00 crc kubenswrapper[4959]: E1007 13:30:00.155148 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91419b4-1b79-4949-a479-2b64fa725b36" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.155161 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91419b4-1b79-4949-a479-2b64fa725b36" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.155316 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91419b4-1b79-4949-a479-2b64fa725b36" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.155919 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.157976 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.158165 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.160592 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt"] Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.258834 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/931b10c1-afe8-4537-8dee-4581dfd7ae27-config-volume\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.258999 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4j2\" (UniqueName: \"kubernetes.io/projected/931b10c1-afe8-4537-8dee-4581dfd7ae27-kube-api-access-pt4j2\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.259030 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/931b10c1-afe8-4537-8dee-4581dfd7ae27-secret-volume\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.361097 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/931b10c1-afe8-4537-8dee-4581dfd7ae27-config-volume\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.361334 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4j2\" (UniqueName: \"kubernetes.io/projected/931b10c1-afe8-4537-8dee-4581dfd7ae27-kube-api-access-pt4j2\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.361359 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/931b10c1-afe8-4537-8dee-4581dfd7ae27-secret-volume\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.362077 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/931b10c1-afe8-4537-8dee-4581dfd7ae27-config-volume\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.381643 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/931b10c1-afe8-4537-8dee-4581dfd7ae27-secret-volume\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.385324 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4j2\" (UniqueName: \"kubernetes.io/projected/931b10c1-afe8-4537-8dee-4581dfd7ae27-kube-api-access-pt4j2\") pod \"collect-profiles-29330730-sh2jt\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.475249 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:00 crc kubenswrapper[4959]: I1007 13:30:00.900901 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt"] Oct 07 13:30:00 crc kubenswrapper[4959]: W1007 13:30:00.901812 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931b10c1_afe8_4537_8dee_4581dfd7ae27.slice/crio-06cfb73c5f3d9bd735c239cf8866f193a1ff5b0557c62bfe9372171921430703 WatchSource:0}: Error finding container 06cfb73c5f3d9bd735c239cf8866f193a1ff5b0557c62bfe9372171921430703: Status 404 returned error can't find the container with id 06cfb73c5f3d9bd735c239cf8866f193a1ff5b0557c62bfe9372171921430703 Oct 07 13:30:01 crc kubenswrapper[4959]: I1007 13:30:01.473210 4959 generic.go:334] "Generic (PLEG): container finished" podID="931b10c1-afe8-4537-8dee-4581dfd7ae27" containerID="dd12d9871942dfc0c11254ef6b6c80b499d8e17730395f2bfc9b362d50eec73d" exitCode=0 Oct 07 13:30:01 crc kubenswrapper[4959]: I1007 13:30:01.474180 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" event={"ID":"931b10c1-afe8-4537-8dee-4581dfd7ae27","Type":"ContainerDied","Data":"dd12d9871942dfc0c11254ef6b6c80b499d8e17730395f2bfc9b362d50eec73d"} Oct 07 13:30:01 crc kubenswrapper[4959]: I1007 13:30:01.474294 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" event={"ID":"931b10c1-afe8-4537-8dee-4581dfd7ae27","Type":"ContainerStarted","Data":"06cfb73c5f3d9bd735c239cf8866f193a1ff5b0557c62bfe9372171921430703"} Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.809485 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:30:02 crc kubenswrapper[4959]: E1007 13:30:02.810221 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.852054 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.908233 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/931b10c1-afe8-4537-8dee-4581dfd7ae27-secret-volume\") pod \"931b10c1-afe8-4537-8dee-4581dfd7ae27\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.908404 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4j2\" (UniqueName: \"kubernetes.io/projected/931b10c1-afe8-4537-8dee-4581dfd7ae27-kube-api-access-pt4j2\") pod \"931b10c1-afe8-4537-8dee-4581dfd7ae27\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.908561 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/931b10c1-afe8-4537-8dee-4581dfd7ae27-config-volume\") pod \"931b10c1-afe8-4537-8dee-4581dfd7ae27\" (UID: \"931b10c1-afe8-4537-8dee-4581dfd7ae27\") " Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.909479 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931b10c1-afe8-4537-8dee-4581dfd7ae27-config-volume" (OuterVolumeSpecName: "config-volume") pod "931b10c1-afe8-4537-8dee-4581dfd7ae27" (UID: "931b10c1-afe8-4537-8dee-4581dfd7ae27"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.913987 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931b10c1-afe8-4537-8dee-4581dfd7ae27-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "931b10c1-afe8-4537-8dee-4581dfd7ae27" (UID: "931b10c1-afe8-4537-8dee-4581dfd7ae27"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:30:02 crc kubenswrapper[4959]: I1007 13:30:02.915977 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931b10c1-afe8-4537-8dee-4581dfd7ae27-kube-api-access-pt4j2" (OuterVolumeSpecName: "kube-api-access-pt4j2") pod "931b10c1-afe8-4537-8dee-4581dfd7ae27" (UID: "931b10c1-afe8-4537-8dee-4581dfd7ae27"). InnerVolumeSpecName "kube-api-access-pt4j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4959]: I1007 13:30:03.010207 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/931b10c1-afe8-4537-8dee-4581dfd7ae27-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4959]: I1007 13:30:03.010245 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/931b10c1-afe8-4537-8dee-4581dfd7ae27-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4959]: I1007 13:30:03.010256 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4j2\" (UniqueName: \"kubernetes.io/projected/931b10c1-afe8-4537-8dee-4581dfd7ae27-kube-api-access-pt4j2\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4959]: I1007 13:30:03.494474 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" event={"ID":"931b10c1-afe8-4537-8dee-4581dfd7ae27","Type":"ContainerDied","Data":"06cfb73c5f3d9bd735c239cf8866f193a1ff5b0557c62bfe9372171921430703"} Oct 07 13:30:03 crc kubenswrapper[4959]: I1007 13:30:03.495066 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06cfb73c5f3d9bd735c239cf8866f193a1ff5b0557c62bfe9372171921430703" Oct 07 13:30:03 crc kubenswrapper[4959]: I1007 13:30:03.494800 4959 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt" Oct 07 13:30:14 crc kubenswrapper[4959]: I1007 13:30:14.809375 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:30:14 crc kubenswrapper[4959]: E1007 13:30:14.810246 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:30:28 crc kubenswrapper[4959]: I1007 13:30:28.838056 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:30:28 crc kubenswrapper[4959]: E1007 13:30:28.839296 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:30:42 crc kubenswrapper[4959]: I1007 13:30:42.808915 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:30:42 crc kubenswrapper[4959]: E1007 13:30:42.810004 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:30:53 crc kubenswrapper[4959]: I1007 13:30:53.210973 4959 scope.go:117] "RemoveContainer" containerID="6566960a178f02f7dd39576384b50412d61e11bffcdd9cf1d8bf580a02c87d59" Oct 07 13:30:56 crc kubenswrapper[4959]: I1007 13:30:56.809113 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:30:56 crc kubenswrapper[4959]: E1007 13:30:56.809603 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:31:11 crc kubenswrapper[4959]: I1007 13:31:11.808790 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:31:11 crc kubenswrapper[4959]: E1007 13:31:11.810591 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:31:25 crc kubenswrapper[4959]: I1007 13:31:25.809060 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:31:25 crc kubenswrapper[4959]: E1007 13:31:25.810091 4959 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:31:40 crc kubenswrapper[4959]: I1007 13:31:40.809771 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:31:40 crc kubenswrapper[4959]: E1007 13:31:40.820878 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:31:54 crc kubenswrapper[4959]: I1007 13:31:54.809500 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:31:54 crc kubenswrapper[4959]: E1007 13:31:54.811150 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:32:09 crc kubenswrapper[4959]: I1007 13:32:09.808952 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:32:09 crc kubenswrapper[4959]: E1007 
13:32:09.809783 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:32:20 crc kubenswrapper[4959]: I1007 13:32:20.809498 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:32:20 crc kubenswrapper[4959]: E1007 13:32:20.810346 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:32:35 crc kubenswrapper[4959]: I1007 13:32:35.808846 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:32:35 crc kubenswrapper[4959]: E1007 13:32:35.809718 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:32:48 crc kubenswrapper[4959]: I1007 13:32:48.816141 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:32:49 crc 
kubenswrapper[4959]: I1007 13:32:49.876833 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"f02bde6494dabf886d665f280b5d309e0e1cc29275dd57e286af213216b21353"} Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.035666 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.045822 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8sxqx"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.053328 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.060320 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.067182 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.072870 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.078456 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.083666 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxbwq"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.089195 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gvz7g"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.094805 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hg6n8"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.100246 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxbwq"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.105614 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.111376 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgtzf"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.117433 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.124268 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.129700 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x94st"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.134845 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dp77w"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.139832 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c2z8f"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.145062 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 
13:34:20.150397 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-v85f8"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.171933 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-v85ck"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.177700 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d47kk"] Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.819613 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02827e48-f5f8-49d2-8442-8e59f8fc1395" path="/var/lib/kubelet/pods/02827e48-f5f8-49d2-8442-8e59f8fc1395/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.820422 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076a4a29-8705-457c-bfe4-0949f231be22" path="/var/lib/kubelet/pods/076a4a29-8705-457c-bfe4-0949f231be22/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.821017 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ccca019-7eaf-4648-89f0-10795926e8c4" path="/var/lib/kubelet/pods/1ccca019-7eaf-4648-89f0-10795926e8c4/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.821637 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e3d7e0-dbba-4eb6-ac01-885b020435ee" path="/var/lib/kubelet/pods/23e3d7e0-dbba-4eb6-ac01-885b020435ee/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.822662 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36002010-873a-43db-bc92-b37c7eb7bf35" path="/var/lib/kubelet/pods/36002010-873a-43db-bc92-b37c7eb7bf35/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.823262 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5518b9cd-5bcc-480a-95ef-6aa41f1b6745" 
path="/var/lib/kubelet/pods/5518b9cd-5bcc-480a-95ef-6aa41f1b6745/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.823848 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e8bfe37-7a34-43c8-9a53-d0a03f45b382" path="/var/lib/kubelet/pods/6e8bfe37-7a34-43c8-9a53-d0a03f45b382/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.824929 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db4250e-518b-4083-8fa6-537615e53340" path="/var/lib/kubelet/pods/7db4250e-518b-4083-8fa6-537615e53340/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.825499 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9414071e-bead-487b-b728-68a86a02118f" path="/var/lib/kubelet/pods/9414071e-bead-487b-b728-68a86a02118f/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.826080 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6c15c4-cb23-4875-83e9-2e5f2669d56a" path="/var/lib/kubelet/pods/9b6c15c4-cb23-4875-83e9-2e5f2669d56a/volumes" Oct 07 13:34:20 crc kubenswrapper[4959]: I1007 13:34:20.827121 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91419b4-1b79-4949-a479-2b64fa725b36" path="/var/lib/kubelet/pods/f91419b4-1b79-4949-a479-2b64fa725b36/volumes" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.944900 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs"] Oct 07 13:34:25 crc kubenswrapper[4959]: E1007 13:34:25.945892 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931b10c1-afe8-4537-8dee-4581dfd7ae27" containerName="collect-profiles" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.945908 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="931b10c1-afe8-4537-8dee-4581dfd7ae27" containerName="collect-profiles" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.946144 4959 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="931b10c1-afe8-4537-8dee-4581dfd7ae27" containerName="collect-profiles" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.947969 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.950742 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.951012 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.951180 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.951226 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.951469 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:34:25 crc kubenswrapper[4959]: I1007 13:34:25.967775 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs"] Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.052752 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.052828 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.052875 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gxg\" (UniqueName: \"kubernetes.io/projected/1522ab05-1ecc-4aad-8196-557397dd2ebf-kube-api-access-j5gxg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.052952 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.052977 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.155031 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: 
\"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.155101 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.155143 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gxg\" (UniqueName: \"kubernetes.io/projected/1522ab05-1ecc-4aad-8196-557397dd2ebf-kube-api-access-j5gxg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.155213 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.155234 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.163399 4959 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.163913 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.164503 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.165910 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.178616 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5gxg\" (UniqueName: \"kubernetes.io/projected/1522ab05-1ecc-4aad-8196-557397dd2ebf-kube-api-access-j5gxg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.273044 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.827292 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs"] Oct 07 13:34:26 crc kubenswrapper[4959]: I1007 13:34:26.833719 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:34:27 crc kubenswrapper[4959]: I1007 13:34:27.730514 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" event={"ID":"1522ab05-1ecc-4aad-8196-557397dd2ebf","Type":"ContainerStarted","Data":"0cd68ddceb9c9451410efb21c7b504ec678f1b3dc34e9024a37b6c99b7bac966"} Oct 07 13:34:27 crc kubenswrapper[4959]: I1007 13:34:27.731087 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" event={"ID":"1522ab05-1ecc-4aad-8196-557397dd2ebf","Type":"ContainerStarted","Data":"a7e2c96910e27232698298b5aef746649fa644cabeaf1370d03e52c209eb4c17"} Oct 07 13:34:27 crc kubenswrapper[4959]: I1007 13:34:27.745798 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" podStartSLOduration=2.191032678 podStartE2EDuration="2.74578325s" podCreationTimestamp="2025-10-07 13:34:25 +0000 UTC" firstStartedPulling="2025-10-07 13:34:26.831818836 +0000 UTC m=+2018.992541513" lastFinishedPulling="2025-10-07 13:34:27.386569408 +0000 UTC m=+2019.547292085" observedRunningTime="2025-10-07 13:34:27.743671889 +0000 UTC m=+2019.904394576" watchObservedRunningTime="2025-10-07 13:34:27.74578325 +0000 UTC m=+2019.906505927" Oct 07 13:34:38 crc kubenswrapper[4959]: 
I1007 13:34:38.840229 4959 generic.go:334] "Generic (PLEG): container finished" podID="1522ab05-1ecc-4aad-8196-557397dd2ebf" containerID="0cd68ddceb9c9451410efb21c7b504ec678f1b3dc34e9024a37b6c99b7bac966" exitCode=0 Oct 07 13:34:38 crc kubenswrapper[4959]: I1007 13:34:38.840414 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" event={"ID":"1522ab05-1ecc-4aad-8196-557397dd2ebf","Type":"ContainerDied","Data":"0cd68ddceb9c9451410efb21c7b504ec678f1b3dc34e9024a37b6c99b7bac966"} Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.345706 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.525581 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ssh-key\") pod \"1522ab05-1ecc-4aad-8196-557397dd2ebf\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.525650 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-inventory\") pod \"1522ab05-1ecc-4aad-8196-557397dd2ebf\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.525735 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ceph\") pod \"1522ab05-1ecc-4aad-8196-557397dd2ebf\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.525756 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5gxg\" (UniqueName: 
\"kubernetes.io/projected/1522ab05-1ecc-4aad-8196-557397dd2ebf-kube-api-access-j5gxg\") pod \"1522ab05-1ecc-4aad-8196-557397dd2ebf\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.525909 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-repo-setup-combined-ca-bundle\") pod \"1522ab05-1ecc-4aad-8196-557397dd2ebf\" (UID: \"1522ab05-1ecc-4aad-8196-557397dd2ebf\") " Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.533195 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1522ab05-1ecc-4aad-8196-557397dd2ebf-kube-api-access-j5gxg" (OuterVolumeSpecName: "kube-api-access-j5gxg") pod "1522ab05-1ecc-4aad-8196-557397dd2ebf" (UID: "1522ab05-1ecc-4aad-8196-557397dd2ebf"). InnerVolumeSpecName "kube-api-access-j5gxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.536093 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1522ab05-1ecc-4aad-8196-557397dd2ebf" (UID: "1522ab05-1ecc-4aad-8196-557397dd2ebf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.541786 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ceph" (OuterVolumeSpecName: "ceph") pod "1522ab05-1ecc-4aad-8196-557397dd2ebf" (UID: "1522ab05-1ecc-4aad-8196-557397dd2ebf"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.557600 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-inventory" (OuterVolumeSpecName: "inventory") pod "1522ab05-1ecc-4aad-8196-557397dd2ebf" (UID: "1522ab05-1ecc-4aad-8196-557397dd2ebf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.560400 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1522ab05-1ecc-4aad-8196-557397dd2ebf" (UID: "1522ab05-1ecc-4aad-8196-557397dd2ebf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.628276 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.628315 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5gxg\" (UniqueName: \"kubernetes.io/projected/1522ab05-1ecc-4aad-8196-557397dd2ebf-kube-api-access-j5gxg\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.628330 4959 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.628342 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:40 crc 
kubenswrapper[4959]: I1007 13:34:40.628356 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1522ab05-1ecc-4aad-8196-557397dd2ebf-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.860707 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" event={"ID":"1522ab05-1ecc-4aad-8196-557397dd2ebf","Type":"ContainerDied","Data":"a7e2c96910e27232698298b5aef746649fa644cabeaf1370d03e52c209eb4c17"} Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.860750 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e2c96910e27232698298b5aef746649fa644cabeaf1370d03e52c209eb4c17" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.860810 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.941142 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz"] Oct 07 13:34:40 crc kubenswrapper[4959]: E1007 13:34:40.941680 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1522ab05-1ecc-4aad-8196-557397dd2ebf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.941699 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1522ab05-1ecc-4aad-8196-557397dd2ebf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.941960 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1522ab05-1ecc-4aad-8196-557397dd2ebf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.942744 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.944981 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.945093 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.946145 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.946222 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.946145 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:34:40 crc kubenswrapper[4959]: I1007 13:34:40.951481 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz"] Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.135116 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.135541 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmc7p\" (UniqueName: \"kubernetes.io/projected/7be1a560-abc0-4b57-a960-85019afbe322-kube-api-access-hmc7p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.135723 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.135836 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.135918 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.237612 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmc7p\" (UniqueName: \"kubernetes.io/projected/7be1a560-abc0-4b57-a960-85019afbe322-kube-api-access-hmc7p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.237735 4959 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.237772 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.237796 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.237820 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.241645 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" 
Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.241945 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.250035 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.250974 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.254595 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmc7p\" (UniqueName: \"kubernetes.io/projected/7be1a560-abc0-4b57-a960-85019afbe322-kube-api-access-hmc7p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.266160 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.751729 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz"] Oct 07 13:34:41 crc kubenswrapper[4959]: I1007 13:34:41.868276 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" event={"ID":"7be1a560-abc0-4b57-a960-85019afbe322","Type":"ContainerStarted","Data":"ae4fe4bece357b4e38c05e8d3421b2170a3e0963daba99db90297ff76d924476"} Oct 07 13:34:42 crc kubenswrapper[4959]: I1007 13:34:42.899573 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" event={"ID":"7be1a560-abc0-4b57-a960-85019afbe322","Type":"ContainerStarted","Data":"808954bd6f8cb0785eaac928f30d98e9d24a8a6411ab2515491c05c766975a8d"} Oct 07 13:34:42 crc kubenswrapper[4959]: I1007 13:34:42.927655 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" podStartSLOduration=2.308357465 podStartE2EDuration="2.927608354s" podCreationTimestamp="2025-10-07 13:34:40 +0000 UTC" firstStartedPulling="2025-10-07 13:34:41.757715691 +0000 UTC m=+2033.918438368" lastFinishedPulling="2025-10-07 13:34:42.37696657 +0000 UTC m=+2034.537689257" observedRunningTime="2025-10-07 13:34:42.919056088 +0000 UTC m=+2035.079778795" watchObservedRunningTime="2025-10-07 13:34:42.927608354 +0000 UTC m=+2035.088331041" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.089667 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vgfrp"] Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.121222 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.122996 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgfrp"] Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.223595 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-utilities\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.223677 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-catalog-content\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.223798 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gfk\" (UniqueName: \"kubernetes.io/projected/9ef4dc93-6847-409b-a0d0-39573d63d5f4-kube-api-access-d5gfk\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.325192 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-utilities\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.325271 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-catalog-content\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.325381 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gfk\" (UniqueName: \"kubernetes.io/projected/9ef4dc93-6847-409b-a0d0-39573d63d5f4-kube-api-access-d5gfk\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.325982 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-utilities\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.326081 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-catalog-content\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.357577 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gfk\" (UniqueName: \"kubernetes.io/projected/9ef4dc93-6847-409b-a0d0-39573d63d5f4-kube-api-access-d5gfk\") pod \"certified-operators-vgfrp\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.453387 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:34:51 crc kubenswrapper[4959]: I1007 13:34:51.985675 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgfrp"] Oct 07 13:34:52 crc kubenswrapper[4959]: I1007 13:34:52.005234 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfrp" event={"ID":"9ef4dc93-6847-409b-a0d0-39573d63d5f4","Type":"ContainerStarted","Data":"16fab1373d70e30eb112ceea4b62d3f4bf9096febf0060ff17596cb494b284b8"} Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.014551 4959 generic.go:334] "Generic (PLEG): container finished" podID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerID="24c973079f441fd8184f0bd15b6f34f0484e6a655bba75f3db0e9f616e7495f6" exitCode=0 Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.014672 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfrp" event={"ID":"9ef4dc93-6847-409b-a0d0-39573d63d5f4","Type":"ContainerDied","Data":"24c973079f441fd8184f0bd15b6f34f0484e6a655bba75f3db0e9f616e7495f6"} Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.346903 4959 scope.go:117] "RemoveContainer" containerID="cbf10ed9f1503a01d70a09e59749f70dcb3934230b05b3b12d48363a6f192e55" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.385508 4959 scope.go:117] "RemoveContainer" containerID="7bc0ba797e4bfb7465f2b0445fe04985722c6d7767d548ecb260f834bf6f8bc3" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.429045 4959 scope.go:117] "RemoveContainer" containerID="9793afd8210076f79a6f5af7970531b47b268d36e0fa1cd2852fb0dbe205a26f" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.456922 4959 scope.go:117] "RemoveContainer" containerID="71f031dfb4d144077375134402fdd0f36301cc31b3de43111de19a5b0598224b" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.495985 4959 scope.go:117] "RemoveContainer" 
containerID="1fa5c56c06fa7f6ce291c46fafd5542c6ad8e1ffc3dbcde1553096c841416829" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.554607 4959 scope.go:117] "RemoveContainer" containerID="73b81f768b5da76e100442f7d2d698de5c45908e61f9fb85ec64bbf5596d7c3a" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.597321 4959 scope.go:117] "RemoveContainer" containerID="eb58a85a48861919ebc71e41aedeff86cd9a3f1465e2684e669ecf0d3d662a0b" Oct 07 13:34:53 crc kubenswrapper[4959]: I1007 13:34:53.625096 4959 scope.go:117] "RemoveContainer" containerID="318bf4cd9f07d946f6ecdbd31b67b0cd55aed3b1df3b2f5ad6e35b6b1b1ba10f" Oct 07 13:34:55 crc kubenswrapper[4959]: I1007 13:34:55.036387 4959 generic.go:334] "Generic (PLEG): container finished" podID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerID="0570ec4e6c0f12915df1e85237920ef6791913c71b18fd2e15c3c4f18de09720" exitCode=0 Oct 07 13:34:55 crc kubenswrapper[4959]: I1007 13:34:55.036500 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfrp" event={"ID":"9ef4dc93-6847-409b-a0d0-39573d63d5f4","Type":"ContainerDied","Data":"0570ec4e6c0f12915df1e85237920ef6791913c71b18fd2e15c3c4f18de09720"} Oct 07 13:34:56 crc kubenswrapper[4959]: I1007 13:34:56.047996 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfrp" event={"ID":"9ef4dc93-6847-409b-a0d0-39573d63d5f4","Type":"ContainerStarted","Data":"602eb0d7980e3041fe94e529c45afb8e2f4cf8ca2b3658556aae6bf943e33ce4"} Oct 07 13:34:56 crc kubenswrapper[4959]: I1007 13:34:56.065723 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vgfrp" podStartSLOduration=2.583281515 podStartE2EDuration="5.065707699s" podCreationTimestamp="2025-10-07 13:34:51 +0000 UTC" firstStartedPulling="2025-10-07 13:34:53.01624096 +0000 UTC m=+2045.176963647" lastFinishedPulling="2025-10-07 13:34:55.498667154 +0000 UTC m=+2047.659389831" 
observedRunningTime="2025-10-07 13:34:56.065237215 +0000 UTC m=+2048.225959902" watchObservedRunningTime="2025-10-07 13:34:56.065707699 +0000 UTC m=+2048.226430376" Oct 07 13:35:01 crc kubenswrapper[4959]: I1007 13:35:01.455165 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:35:01 crc kubenswrapper[4959]: I1007 13:35:01.456438 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:35:01 crc kubenswrapper[4959]: I1007 13:35:01.509950 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:35:02 crc kubenswrapper[4959]: I1007 13:35:02.165075 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:35:02 crc kubenswrapper[4959]: I1007 13:35:02.213395 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgfrp"] Oct 07 13:35:04 crc kubenswrapper[4959]: I1007 13:35:04.123613 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vgfrp" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="registry-server" containerID="cri-o://602eb0d7980e3041fe94e529c45afb8e2f4cf8ca2b3658556aae6bf943e33ce4" gracePeriod=2 Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.134166 4959 generic.go:334] "Generic (PLEG): container finished" podID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerID="602eb0d7980e3041fe94e529c45afb8e2f4cf8ca2b3658556aae6bf943e33ce4" exitCode=0 Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.134370 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfrp" 
event={"ID":"9ef4dc93-6847-409b-a0d0-39573d63d5f4","Type":"ContainerDied","Data":"602eb0d7980e3041fe94e529c45afb8e2f4cf8ca2b3658556aae6bf943e33ce4"} Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.277148 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.416345 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-catalog-content\") pod \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.416410 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-utilities\") pod \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.416490 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5gfk\" (UniqueName: \"kubernetes.io/projected/9ef4dc93-6847-409b-a0d0-39573d63d5f4-kube-api-access-d5gfk\") pod \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\" (UID: \"9ef4dc93-6847-409b-a0d0-39573d63d5f4\") " Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.417673 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-utilities" (OuterVolumeSpecName: "utilities") pod "9ef4dc93-6847-409b-a0d0-39573d63d5f4" (UID: "9ef4dc93-6847-409b-a0d0-39573d63d5f4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.422765 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef4dc93-6847-409b-a0d0-39573d63d5f4-kube-api-access-d5gfk" (OuterVolumeSpecName: "kube-api-access-d5gfk") pod "9ef4dc93-6847-409b-a0d0-39573d63d5f4" (UID: "9ef4dc93-6847-409b-a0d0-39573d63d5f4"). InnerVolumeSpecName "kube-api-access-d5gfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.520915 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.520967 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5gfk\" (UniqueName: \"kubernetes.io/projected/9ef4dc93-6847-409b-a0d0-39573d63d5f4-kube-api-access-d5gfk\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.895778 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ef4dc93-6847-409b-a0d0-39573d63d5f4" (UID: "9ef4dc93-6847-409b-a0d0-39573d63d5f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:35:05 crc kubenswrapper[4959]: I1007 13:35:05.929144 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef4dc93-6847-409b-a0d0-39573d63d5f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.142597 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfrp" event={"ID":"9ef4dc93-6847-409b-a0d0-39573d63d5f4","Type":"ContainerDied","Data":"16fab1373d70e30eb112ceea4b62d3f4bf9096febf0060ff17596cb494b284b8"} Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.142664 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfrp" Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.142684 4959 scope.go:117] "RemoveContainer" containerID="602eb0d7980e3041fe94e529c45afb8e2f4cf8ca2b3658556aae6bf943e33ce4" Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.161346 4959 scope.go:117] "RemoveContainer" containerID="0570ec4e6c0f12915df1e85237920ef6791913c71b18fd2e15c3c4f18de09720" Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.180067 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgfrp"] Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.199238 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vgfrp"] Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.202038 4959 scope.go:117] "RemoveContainer" containerID="24c973079f441fd8184f0bd15b6f34f0484e6a655bba75f3db0e9f616e7495f6" Oct 07 13:35:06 crc kubenswrapper[4959]: I1007 13:35:06.823189 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" path="/var/lib/kubelet/pods/9ef4dc93-6847-409b-a0d0-39573d63d5f4/volumes" Oct 07 13:35:07 crc 
kubenswrapper[4959]: I1007 13:35:07.695731 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:35:07 crc kubenswrapper[4959]: I1007 13:35:07.696003 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:35:37 crc kubenswrapper[4959]: I1007 13:35:37.695907 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:35:37 crc kubenswrapper[4959]: I1007 13:35:37.696514 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.276156 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tknv9"] Oct 07 13:35:45 crc kubenswrapper[4959]: E1007 13:35:45.277198 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="extract-utilities" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.277217 4959 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="extract-utilities" Oct 07 13:35:45 crc kubenswrapper[4959]: E1007 13:35:45.277240 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="extract-content" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.277247 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="extract-content" Oct 07 13:35:45 crc kubenswrapper[4959]: E1007 13:35:45.277256 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="registry-server" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.277265 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="registry-server" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.277491 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef4dc93-6847-409b-a0d0-39573d63d5f4" containerName="registry-server" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.279052 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.287958 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tknv9"] Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.374972 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-utilities\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.375316 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-catalog-content\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.375450 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwsz\" (UniqueName: \"kubernetes.io/projected/49720c12-fbfb-4c30-857c-1e2bcec0fc11-kube-api-access-lnwsz\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.477393 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwsz\" (UniqueName: \"kubernetes.io/projected/49720c12-fbfb-4c30-857c-1e2bcec0fc11-kube-api-access-lnwsz\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.477515 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-utilities\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.477570 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-catalog-content\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.478075 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-catalog-content\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.478234 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-utilities\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.508485 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnwsz\" (UniqueName: \"kubernetes.io/projected/49720c12-fbfb-4c30-857c-1e2bcec0fc11-kube-api-access-lnwsz\") pod \"redhat-marketplace-tknv9\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:45 crc kubenswrapper[4959]: I1007 13:35:45.605029 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:46 crc kubenswrapper[4959]: I1007 13:35:46.034759 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tknv9"] Oct 07 13:35:46 crc kubenswrapper[4959]: I1007 13:35:46.544066 4959 generic.go:334] "Generic (PLEG): container finished" podID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerID="7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45" exitCode=0 Oct 07 13:35:46 crc kubenswrapper[4959]: I1007 13:35:46.544121 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tknv9" event={"ID":"49720c12-fbfb-4c30-857c-1e2bcec0fc11","Type":"ContainerDied","Data":"7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45"} Oct 07 13:35:46 crc kubenswrapper[4959]: I1007 13:35:46.544458 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tknv9" event={"ID":"49720c12-fbfb-4c30-857c-1e2bcec0fc11","Type":"ContainerStarted","Data":"e8e20438352e79d0293a9521fd84b23466f53d93299d5c71b11bd3d8e82ff08f"} Oct 07 13:35:47 crc kubenswrapper[4959]: E1007 13:35:47.834005 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49720c12_fbfb_4c30_857c_1e2bcec0fc11.slice/crio-2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49720c12_fbfb_4c30_857c_1e2bcec0fc11.slice/crio-conmon-2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:35:48 crc kubenswrapper[4959]: I1007 13:35:48.563381 4959 generic.go:334] "Generic (PLEG): container finished" podID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" 
containerID="2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6" exitCode=0 Oct 07 13:35:48 crc kubenswrapper[4959]: I1007 13:35:48.563435 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tknv9" event={"ID":"49720c12-fbfb-4c30-857c-1e2bcec0fc11","Type":"ContainerDied","Data":"2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6"} Oct 07 13:35:49 crc kubenswrapper[4959]: I1007 13:35:49.576224 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tknv9" event={"ID":"49720c12-fbfb-4c30-857c-1e2bcec0fc11","Type":"ContainerStarted","Data":"88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178"} Oct 07 13:35:49 crc kubenswrapper[4959]: I1007 13:35:49.599865 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tknv9" podStartSLOduration=1.920420883 podStartE2EDuration="4.599847196s" podCreationTimestamp="2025-10-07 13:35:45 +0000 UTC" firstStartedPulling="2025-10-07 13:35:46.5459935 +0000 UTC m=+2098.706716177" lastFinishedPulling="2025-10-07 13:35:49.225419813 +0000 UTC m=+2101.386142490" observedRunningTime="2025-10-07 13:35:49.595939424 +0000 UTC m=+2101.756662121" watchObservedRunningTime="2025-10-07 13:35:49.599847196 +0000 UTC m=+2101.760569873" Oct 07 13:35:53 crc kubenswrapper[4959]: I1007 13:35:53.794260 4959 scope.go:117] "RemoveContainer" containerID="859a1f1699d4d70ac2543998c1dee8c5f536514e5364ee1d785901df3052f839" Oct 07 13:35:53 crc kubenswrapper[4959]: I1007 13:35:53.867194 4959 scope.go:117] "RemoveContainer" containerID="028feddaa85176da70d71bba1ecd8f32911fe676a12b1e045cab3cc2f8703659" Oct 07 13:35:53 crc kubenswrapper[4959]: I1007 13:35:53.919697 4959 scope.go:117] "RemoveContainer" containerID="b69b14929ea7ddc91ae43a3298eeed4d06333baa0348a4b99c72946637cccf8c" Oct 07 13:35:55 crc kubenswrapper[4959]: I1007 13:35:55.606231 4959 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:55 crc kubenswrapper[4959]: I1007 13:35:55.606567 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:55 crc kubenswrapper[4959]: I1007 13:35:55.684282 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:55 crc kubenswrapper[4959]: I1007 13:35:55.730445 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:55 crc kubenswrapper[4959]: I1007 13:35:55.917451 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tknv9"] Oct 07 13:35:57 crc kubenswrapper[4959]: I1007 13:35:57.634573 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tknv9" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="registry-server" containerID="cri-o://88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178" gracePeriod=2 Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.113483 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.205654 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-utilities\") pod \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.205834 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnwsz\" (UniqueName: \"kubernetes.io/projected/49720c12-fbfb-4c30-857c-1e2bcec0fc11-kube-api-access-lnwsz\") pod \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.206166 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-catalog-content\") pod \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\" (UID: \"49720c12-fbfb-4c30-857c-1e2bcec0fc11\") " Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.207688 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-utilities" (OuterVolumeSpecName: "utilities") pod "49720c12-fbfb-4c30-857c-1e2bcec0fc11" (UID: "49720c12-fbfb-4c30-857c-1e2bcec0fc11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.224072 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49720c12-fbfb-4c30-857c-1e2bcec0fc11-kube-api-access-lnwsz" (OuterVolumeSpecName: "kube-api-access-lnwsz") pod "49720c12-fbfb-4c30-857c-1e2bcec0fc11" (UID: "49720c12-fbfb-4c30-857c-1e2bcec0fc11"). InnerVolumeSpecName "kube-api-access-lnwsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.226273 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49720c12-fbfb-4c30-857c-1e2bcec0fc11" (UID: "49720c12-fbfb-4c30-857c-1e2bcec0fc11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.226933 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.226957 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnwsz\" (UniqueName: \"kubernetes.io/projected/49720c12-fbfb-4c30-857c-1e2bcec0fc11-kube-api-access-lnwsz\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.226970 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49720c12-fbfb-4c30-857c-1e2bcec0fc11-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.647466 4959 generic.go:334] "Generic (PLEG): container finished" podID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerID="88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178" exitCode=0 Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.647513 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tknv9" event={"ID":"49720c12-fbfb-4c30-857c-1e2bcec0fc11","Type":"ContainerDied","Data":"88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178"} Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.647869 4959 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-tknv9" event={"ID":"49720c12-fbfb-4c30-857c-1e2bcec0fc11","Type":"ContainerDied","Data":"e8e20438352e79d0293a9521fd84b23466f53d93299d5c71b11bd3d8e82ff08f"} Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.647546 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tknv9" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.647893 4959 scope.go:117] "RemoveContainer" containerID="88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.680792 4959 scope.go:117] "RemoveContainer" containerID="2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.687375 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tknv9"] Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.693146 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tknv9"] Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.712103 4959 scope.go:117] "RemoveContainer" containerID="7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.753518 4959 scope.go:117] "RemoveContainer" containerID="88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178" Oct 07 13:35:58 crc kubenswrapper[4959]: E1007 13:35:58.754077 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178\": container with ID starting with 88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178 not found: ID does not exist" containerID="88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.754130 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178"} err="failed to get container status \"88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178\": rpc error: code = NotFound desc = could not find container \"88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178\": container with ID starting with 88b6b1e69efe45eabd20999a1a71b22a2930ce0c0140a434dc90b7de688b9178 not found: ID does not exist" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.754160 4959 scope.go:117] "RemoveContainer" containerID="2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6" Oct 07 13:35:58 crc kubenswrapper[4959]: E1007 13:35:58.754591 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6\": container with ID starting with 2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6 not found: ID does not exist" containerID="2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.754617 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6"} err="failed to get container status \"2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6\": rpc error: code = NotFound desc = could not find container \"2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6\": container with ID starting with 2acd8677c074e03619f3b35b436b6de423f9ea1286fe0d862c9ee5eb5eaf72a6 not found: ID does not exist" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.754651 4959 scope.go:117] "RemoveContainer" containerID="7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45" Oct 07 13:35:58 crc kubenswrapper[4959]: E1007 
13:35:58.755031 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45\": container with ID starting with 7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45 not found: ID does not exist" containerID="7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.755107 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45"} err="failed to get container status \"7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45\": rpc error: code = NotFound desc = could not find container \"7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45\": container with ID starting with 7b286a83fab419ffe3fdbf507a2ec9ed1cadc029605b68d3a8e37aa6c7aa5b45 not found: ID does not exist" Oct 07 13:35:58 crc kubenswrapper[4959]: I1007 13:35:58.821048 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" path="/var/lib/kubelet/pods/49720c12-fbfb-4c30-857c-1e2bcec0fc11/volumes" Oct 07 13:36:07 crc kubenswrapper[4959]: I1007 13:36:07.698200 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:36:07 crc kubenswrapper[4959]: I1007 13:36:07.698693 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 07 13:36:07 crc kubenswrapper[4959]: I1007 13:36:07.698742 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:36:07 crc kubenswrapper[4959]: I1007 13:36:07.699470 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f02bde6494dabf886d665f280b5d309e0e1cc29275dd57e286af213216b21353"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:36:07 crc kubenswrapper[4959]: I1007 13:36:07.699540 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://f02bde6494dabf886d665f280b5d309e0e1cc29275dd57e286af213216b21353" gracePeriod=600 Oct 07 13:36:08 crc kubenswrapper[4959]: I1007 13:36:08.722977 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="f02bde6494dabf886d665f280b5d309e0e1cc29275dd57e286af213216b21353" exitCode=0 Oct 07 13:36:08 crc kubenswrapper[4959]: I1007 13:36:08.723069 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"f02bde6494dabf886d665f280b5d309e0e1cc29275dd57e286af213216b21353"} Oct 07 13:36:08 crc kubenswrapper[4959]: I1007 13:36:08.723536 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"} Oct 07 13:36:08 crc 
kubenswrapper[4959]: I1007 13:36:08.723561 4959 scope.go:117] "RemoveContainer" containerID="e39a8d26eb52461ac6acb0f21169e58f030a0048b92a50e8ed705bb38bf52b3a" Oct 07 13:36:10 crc kubenswrapper[4959]: I1007 13:36:10.741178 4959 generic.go:334] "Generic (PLEG): container finished" podID="7be1a560-abc0-4b57-a960-85019afbe322" containerID="808954bd6f8cb0785eaac928f30d98e9d24a8a6411ab2515491c05c766975a8d" exitCode=0 Oct 07 13:36:10 crc kubenswrapper[4959]: I1007 13:36:10.741266 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" event={"ID":"7be1a560-abc0-4b57-a960-85019afbe322","Type":"ContainerDied","Data":"808954bd6f8cb0785eaac928f30d98e9d24a8a6411ab2515491c05c766975a8d"} Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.114253 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.289404 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-inventory\") pod \"7be1a560-abc0-4b57-a960-85019afbe322\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.290121 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-bootstrap-combined-ca-bundle\") pod \"7be1a560-abc0-4b57-a960-85019afbe322\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.290289 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ceph\") pod \"7be1a560-abc0-4b57-a960-85019afbe322\" (UID: 
\"7be1a560-abc0-4b57-a960-85019afbe322\") " Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.290443 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmc7p\" (UniqueName: \"kubernetes.io/projected/7be1a560-abc0-4b57-a960-85019afbe322-kube-api-access-hmc7p\") pod \"7be1a560-abc0-4b57-a960-85019afbe322\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.290589 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ssh-key\") pod \"7be1a560-abc0-4b57-a960-85019afbe322\" (UID: \"7be1a560-abc0-4b57-a960-85019afbe322\") " Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.298264 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7be1a560-abc0-4b57-a960-85019afbe322" (UID: "7be1a560-abc0-4b57-a960-85019afbe322"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.298396 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ceph" (OuterVolumeSpecName: "ceph") pod "7be1a560-abc0-4b57-a960-85019afbe322" (UID: "7be1a560-abc0-4b57-a960-85019afbe322"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.299150 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be1a560-abc0-4b57-a960-85019afbe322-kube-api-access-hmc7p" (OuterVolumeSpecName: "kube-api-access-hmc7p") pod "7be1a560-abc0-4b57-a960-85019afbe322" (UID: "7be1a560-abc0-4b57-a960-85019afbe322"). 
InnerVolumeSpecName "kube-api-access-hmc7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.320263 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-inventory" (OuterVolumeSpecName: "inventory") pod "7be1a560-abc0-4b57-a960-85019afbe322" (UID: "7be1a560-abc0-4b57-a960-85019afbe322"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.320779 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7be1a560-abc0-4b57-a960-85019afbe322" (UID: "7be1a560-abc0-4b57-a960-85019afbe322"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.392687 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.392725 4959 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.392739 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.392749 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmc7p\" (UniqueName: \"kubernetes.io/projected/7be1a560-abc0-4b57-a960-85019afbe322-kube-api-access-hmc7p\") on node \"crc\" 
DevicePath \"\"" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.392758 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7be1a560-abc0-4b57-a960-85019afbe322-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.756448 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" event={"ID":"7be1a560-abc0-4b57-a960-85019afbe322","Type":"ContainerDied","Data":"ae4fe4bece357b4e38c05e8d3421b2170a3e0963daba99db90297ff76d924476"} Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.756491 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae4fe4bece357b4e38c05e8d3421b2170a3e0963daba99db90297ff76d924476" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.756513 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.840955 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847"] Oct 07 13:36:12 crc kubenswrapper[4959]: E1007 13:36:12.841326 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="extract-utilities" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.841343 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="extract-utilities" Oct 07 13:36:12 crc kubenswrapper[4959]: E1007 13:36:12.841364 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="registry-server" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.841371 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" 
containerName="registry-server" Oct 07 13:36:12 crc kubenswrapper[4959]: E1007 13:36:12.841384 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="extract-content" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.841390 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="extract-content" Oct 07 13:36:12 crc kubenswrapper[4959]: E1007 13:36:12.841411 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be1a560-abc0-4b57-a960-85019afbe322" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.841418 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be1a560-abc0-4b57-a960-85019afbe322" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.841606 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="49720c12-fbfb-4c30-857c-1e2bcec0fc11" containerName="registry-server" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.841641 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be1a560-abc0-4b57-a960-85019afbe322" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.842204 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.844528 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.844564 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.844802 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.845934 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.847006 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.851904 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847"] Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.903062 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/2e1533f6-5266-414d-b116-f87c2acd344a-kube-api-access-pppk7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.903119 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: 
\"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.903160 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:12 crc kubenswrapper[4959]: I1007 13:36:12.903242 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.004763 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.004861 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.004935 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/2e1533f6-5266-414d-b116-f87c2acd344a-kube-api-access-pppk7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.004960 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.008895 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.008910 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.009431 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: 
\"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.020582 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/2e1533f6-5266-414d-b116-f87c2acd344a-kube-api-access-pppk7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wh847\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.169726 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.656482 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847"] Oct 07 13:36:13 crc kubenswrapper[4959]: I1007 13:36:13.765554 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" event={"ID":"2e1533f6-5266-414d-b116-f87c2acd344a","Type":"ContainerStarted","Data":"1cf8b96413c758565c787a1df070a87d3e835e07eb0d16a27d8c3e83e2338134"} Oct 07 13:36:14 crc kubenswrapper[4959]: I1007 13:36:14.777920 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" event={"ID":"2e1533f6-5266-414d-b116-f87c2acd344a","Type":"ContainerStarted","Data":"36a8bf5fb26ee1bf9a802d4fc01c36a680ab3ae5b10afd644dc855ff5dab1a39"} Oct 07 13:36:14 crc kubenswrapper[4959]: I1007 13:36:14.793079 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" podStartSLOduration=2.306392415 podStartE2EDuration="2.79305979s" podCreationTimestamp="2025-10-07 13:36:12 
+0000 UTC" firstStartedPulling="2025-10-07 13:36:13.670839363 +0000 UTC m=+2125.831562050" lastFinishedPulling="2025-10-07 13:36:14.157506748 +0000 UTC m=+2126.318229425" observedRunningTime="2025-10-07 13:36:14.79201832 +0000 UTC m=+2126.952741007" watchObservedRunningTime="2025-10-07 13:36:14.79305979 +0000 UTC m=+2126.953782467" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.642920 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jpp6f"] Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.645432 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.656853 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jpp6f"] Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.676052 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2s5\" (UniqueName: \"kubernetes.io/projected/763c7d3a-ee98-4b07-ae25-d8f925643ca1-kube-api-access-lw2s5\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.676213 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-catalog-content\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.676305 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-utilities\") pod 
\"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.778400 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2s5\" (UniqueName: \"kubernetes.io/projected/763c7d3a-ee98-4b07-ae25-d8f925643ca1-kube-api-access-lw2s5\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.778512 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-catalog-content\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.778578 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-utilities\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.779037 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-utilities\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.779087 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-catalog-content\") pod \"community-operators-jpp6f\" (UID: 
\"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.810795 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2s5\" (UniqueName: \"kubernetes.io/projected/763c7d3a-ee98-4b07-ae25-d8f925643ca1-kube-api-access-lw2s5\") pod \"community-operators-jpp6f\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:16 crc kubenswrapper[4959]: I1007 13:36:16.967547 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:17 crc kubenswrapper[4959]: W1007 13:36:17.518831 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763c7d3a_ee98_4b07_ae25_d8f925643ca1.slice/crio-880e82fc03049bd625a790b28ce941a3e74ff2f415422198a29efc951391ec71 WatchSource:0}: Error finding container 880e82fc03049bd625a790b28ce941a3e74ff2f415422198a29efc951391ec71: Status 404 returned error can't find the container with id 880e82fc03049bd625a790b28ce941a3e74ff2f415422198a29efc951391ec71 Oct 07 13:36:17 crc kubenswrapper[4959]: I1007 13:36:17.532409 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jpp6f"] Oct 07 13:36:17 crc kubenswrapper[4959]: I1007 13:36:17.804741 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpp6f" event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerStarted","Data":"1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc"} Oct 07 13:36:17 crc kubenswrapper[4959]: I1007 13:36:17.804789 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpp6f" 
event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerStarted","Data":"880e82fc03049bd625a790b28ce941a3e74ff2f415422198a29efc951391ec71"} Oct 07 13:36:18 crc kubenswrapper[4959]: I1007 13:36:18.815420 4959 generic.go:334] "Generic (PLEG): container finished" podID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerID="1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc" exitCode=0 Oct 07 13:36:18 crc kubenswrapper[4959]: I1007 13:36:18.817962 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpp6f" event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerDied","Data":"1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc"} Oct 07 13:36:20 crc kubenswrapper[4959]: I1007 13:36:20.835765 4959 generic.go:334] "Generic (PLEG): container finished" podID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerID="8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1" exitCode=0 Oct 07 13:36:20 crc kubenswrapper[4959]: I1007 13:36:20.835858 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpp6f" event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerDied","Data":"8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1"} Oct 07 13:36:22 crc kubenswrapper[4959]: I1007 13:36:22.852000 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpp6f" event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerStarted","Data":"cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117"} Oct 07 13:36:22 crc kubenswrapper[4959]: I1007 13:36:22.870309 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jpp6f" podStartSLOduration=4.051405842 podStartE2EDuration="6.870287851s" podCreationTimestamp="2025-10-07 13:36:16 +0000 UTC" firstStartedPulling="2025-10-07 13:36:18.821609136 +0000 
UTC m=+2130.982331813" lastFinishedPulling="2025-10-07 13:36:21.640491145 +0000 UTC m=+2133.801213822" observedRunningTime="2025-10-07 13:36:22.865603916 +0000 UTC m=+2135.026326613" watchObservedRunningTime="2025-10-07 13:36:22.870287851 +0000 UTC m=+2135.031010528" Oct 07 13:36:26 crc kubenswrapper[4959]: I1007 13:36:26.968428 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:26 crc kubenswrapper[4959]: I1007 13:36:26.968941 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:27 crc kubenswrapper[4959]: I1007 13:36:27.025982 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:27 crc kubenswrapper[4959]: I1007 13:36:27.933568 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:27 crc kubenswrapper[4959]: I1007 13:36:27.980561 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jpp6f"] Oct 07 13:36:29 crc kubenswrapper[4959]: I1007 13:36:29.898758 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jpp6f" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="registry-server" containerID="cri-o://cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117" gracePeriod=2 Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.412169 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.514308 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-catalog-content\") pod \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.514760 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw2s5\" (UniqueName: \"kubernetes.io/projected/763c7d3a-ee98-4b07-ae25-d8f925643ca1-kube-api-access-lw2s5\") pod \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.514881 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-utilities\") pod \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\" (UID: \"763c7d3a-ee98-4b07-ae25-d8f925643ca1\") " Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.515769 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-utilities" (OuterVolumeSpecName: "utilities") pod "763c7d3a-ee98-4b07-ae25-d8f925643ca1" (UID: "763c7d3a-ee98-4b07-ae25-d8f925643ca1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.522864 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763c7d3a-ee98-4b07-ae25-d8f925643ca1-kube-api-access-lw2s5" (OuterVolumeSpecName: "kube-api-access-lw2s5") pod "763c7d3a-ee98-4b07-ae25-d8f925643ca1" (UID: "763c7d3a-ee98-4b07-ae25-d8f925643ca1"). InnerVolumeSpecName "kube-api-access-lw2s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.571851 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "763c7d3a-ee98-4b07-ae25-d8f925643ca1" (UID: "763c7d3a-ee98-4b07-ae25-d8f925643ca1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.618772 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.618806 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw2s5\" (UniqueName: \"kubernetes.io/projected/763c7d3a-ee98-4b07-ae25-d8f925643ca1-kube-api-access-lw2s5\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.618817 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763c7d3a-ee98-4b07-ae25-d8f925643ca1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.908039 4959 generic.go:334] "Generic (PLEG): container finished" podID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerID="cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117" exitCode=0 Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.908095 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpp6f" event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerDied","Data":"cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117"} Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.908389 4959 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-jpp6f" event={"ID":"763c7d3a-ee98-4b07-ae25-d8f925643ca1","Type":"ContainerDied","Data":"880e82fc03049bd625a790b28ce941a3e74ff2f415422198a29efc951391ec71"} Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.908412 4959 scope.go:117] "RemoveContainer" containerID="cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.908155 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jpp6f" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.931375 4959 scope.go:117] "RemoveContainer" containerID="8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.940265 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jpp6f"] Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.952008 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jpp6f"] Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.964219 4959 scope.go:117] "RemoveContainer" containerID="1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.993258 4959 scope.go:117] "RemoveContainer" containerID="cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117" Oct 07 13:36:30 crc kubenswrapper[4959]: E1007 13:36:30.993827 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117\": container with ID starting with cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117 not found: ID does not exist" containerID="cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 
13:36:30.993875 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117"} err="failed to get container status \"cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117\": rpc error: code = NotFound desc = could not find container \"cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117\": container with ID starting with cfd58638dc6ceea90949ad0750f4a93ffb37ff4df1f7ea5fe7089c22a4d27117 not found: ID does not exist" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.993903 4959 scope.go:117] "RemoveContainer" containerID="8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1" Oct 07 13:36:30 crc kubenswrapper[4959]: E1007 13:36:30.995039 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1\": container with ID starting with 8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1 not found: ID does not exist" containerID="8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.995073 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1"} err="failed to get container status \"8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1\": rpc error: code = NotFound desc = could not find container \"8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1\": container with ID starting with 8ef4e6b3c4c223acc086c66e618a51ede1e77642ee8bb7eb72b1824ec7c5a0a1 not found: ID does not exist" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.995091 4959 scope.go:117] "RemoveContainer" containerID="1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc" Oct 07 13:36:30 crc 
kubenswrapper[4959]: E1007 13:36:30.995353 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc\": container with ID starting with 1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc not found: ID does not exist" containerID="1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc" Oct 07 13:36:30 crc kubenswrapper[4959]: I1007 13:36:30.995390 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc"} err="failed to get container status \"1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc\": rpc error: code = NotFound desc = could not find container \"1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc\": container with ID starting with 1418a001c202ae8e999166d5e372fbbc3f76feef1b55ce6c7ad09317b7e48bdc not found: ID does not exist" Oct 07 13:36:32 crc kubenswrapper[4959]: I1007 13:36:32.820092 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" path="/var/lib/kubelet/pods/763c7d3a-ee98-4b07-ae25-d8f925643ca1/volumes" Oct 07 13:36:37 crc kubenswrapper[4959]: I1007 13:36:37.963168 4959 generic.go:334] "Generic (PLEG): container finished" podID="2e1533f6-5266-414d-b116-f87c2acd344a" containerID="36a8bf5fb26ee1bf9a802d4fc01c36a680ab3ae5b10afd644dc855ff5dab1a39" exitCode=0 Oct 07 13:36:37 crc kubenswrapper[4959]: I1007 13:36:37.963799 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" event={"ID":"2e1533f6-5266-414d-b116-f87c2acd344a","Type":"ContainerDied","Data":"36a8bf5fb26ee1bf9a802d4fc01c36a680ab3ae5b10afd644dc855ff5dab1a39"} Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.365369 4959 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.441080 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ssh-key\") pod \"2e1533f6-5266-414d-b116-f87c2acd344a\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.441189 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/2e1533f6-5266-414d-b116-f87c2acd344a-kube-api-access-pppk7\") pod \"2e1533f6-5266-414d-b116-f87c2acd344a\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.441281 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ceph\") pod \"2e1533f6-5266-414d-b116-f87c2acd344a\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.441341 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-inventory\") pod \"2e1533f6-5266-414d-b116-f87c2acd344a\" (UID: \"2e1533f6-5266-414d-b116-f87c2acd344a\") " Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.447141 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ceph" (OuterVolumeSpecName: "ceph") pod "2e1533f6-5266-414d-b116-f87c2acd344a" (UID: "2e1533f6-5266-414d-b116-f87c2acd344a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.447397 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e1533f6-5266-414d-b116-f87c2acd344a-kube-api-access-pppk7" (OuterVolumeSpecName: "kube-api-access-pppk7") pod "2e1533f6-5266-414d-b116-f87c2acd344a" (UID: "2e1533f6-5266-414d-b116-f87c2acd344a"). InnerVolumeSpecName "kube-api-access-pppk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.473854 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-inventory" (OuterVolumeSpecName: "inventory") pod "2e1533f6-5266-414d-b116-f87c2acd344a" (UID: "2e1533f6-5266-414d-b116-f87c2acd344a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.475894 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e1533f6-5266-414d-b116-f87c2acd344a" (UID: "2e1533f6-5266-414d-b116-f87c2acd344a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.543018 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/2e1533f6-5266-414d-b116-f87c2acd344a-kube-api-access-pppk7\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.543577 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.543678 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.543785 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e1533f6-5266-414d-b116-f87c2acd344a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.977932 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" event={"ID":"2e1533f6-5266-414d-b116-f87c2acd344a","Type":"ContainerDied","Data":"1cf8b96413c758565c787a1df070a87d3e835e07eb0d16a27d8c3e83e2338134"} Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.977992 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf8b96413c758565c787a1df070a87d3e835e07eb0d16a27d8c3e83e2338134" Oct 07 13:36:39 crc kubenswrapper[4959]: I1007 13:36:39.978050 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wh847" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.063371 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs"] Oct 07 13:36:40 crc kubenswrapper[4959]: E1007 13:36:40.063758 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="registry-server" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.063775 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="registry-server" Oct 07 13:36:40 crc kubenswrapper[4959]: E1007 13:36:40.063788 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="extract-utilities" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.063796 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="extract-utilities" Oct 07 13:36:40 crc kubenswrapper[4959]: E1007 13:36:40.063812 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1533f6-5266-414d-b116-f87c2acd344a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.063819 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1533f6-5266-414d-b116-f87c2acd344a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:40 crc kubenswrapper[4959]: E1007 13:36:40.063834 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="extract-content" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.063841 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="extract-content" Oct 07 13:36:40 crc 
kubenswrapper[4959]: I1007 13:36:40.064015 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="763c7d3a-ee98-4b07-ae25-d8f925643ca1" containerName="registry-server" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.064032 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1533f6-5266-414d-b116-f87c2acd344a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.064602 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.067071 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.067075 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.067278 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.067664 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.068531 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.070391 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs"] Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.151122 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ceph\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.151184 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.151245 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.151271 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crxr\" (UniqueName: \"kubernetes.io/projected/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-kube-api-access-9crxr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.253329 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.253693 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.253823 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crxr\" (UniqueName: \"kubernetes.io/projected/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-kube-api-access-9crxr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.253967 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.258536 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.260467 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.260612 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.273297 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crxr\" (UniqueName: \"kubernetes.io/projected/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-kube-api-access-9crxr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.391298 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.888658 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs"] Oct 07 13:36:40 crc kubenswrapper[4959]: I1007 13:36:40.988879 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" event={"ID":"0d3a592c-85aa-455c-a39e-cf2ec5c1f292","Type":"ContainerStarted","Data":"a445e16f6eee0c1f40a53cad890cc5c996a8e35187b5b2a46451e45a3ba90b7a"} Oct 07 13:36:42 crc kubenswrapper[4959]: I1007 13:36:42.003073 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" event={"ID":"0d3a592c-85aa-455c-a39e-cf2ec5c1f292","Type":"ContainerStarted","Data":"b1271aa71c473c1179fcfb8bdb63a1915b5f4ff5015e27c83cdc8dfe867f7c8e"} Oct 07 13:36:42 crc kubenswrapper[4959]: I1007 13:36:42.031930 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" podStartSLOduration=1.507145594 podStartE2EDuration="2.031910596s" podCreationTimestamp="2025-10-07 13:36:40 +0000 UTC" firstStartedPulling="2025-10-07 13:36:40.891141713 +0000 UTC m=+2153.051864390" lastFinishedPulling="2025-10-07 13:36:41.415906675 +0000 UTC m=+2153.576629392" observedRunningTime="2025-10-07 13:36:42.021673751 +0000 UTC m=+2154.182396508" watchObservedRunningTime="2025-10-07 13:36:42.031910596 +0000 UTC m=+2154.192633283" Oct 07 13:36:46 crc kubenswrapper[4959]: I1007 13:36:46.039581 4959 generic.go:334] "Generic (PLEG): container finished" podID="0d3a592c-85aa-455c-a39e-cf2ec5c1f292" containerID="b1271aa71c473c1179fcfb8bdb63a1915b5f4ff5015e27c83cdc8dfe867f7c8e" exitCode=0 Oct 07 13:36:46 crc kubenswrapper[4959]: I1007 13:36:46.039710 4959 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" event={"ID":"0d3a592c-85aa-455c-a39e-cf2ec5c1f292","Type":"ContainerDied","Data":"b1271aa71c473c1179fcfb8bdb63a1915b5f4ff5015e27c83cdc8dfe867f7c8e"} Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.407313 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.494040 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-inventory\") pod \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.494108 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ssh-key\") pod \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.494134 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ceph\") pod \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.494198 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9crxr\" (UniqueName: \"kubernetes.io/projected/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-kube-api-access-9crxr\") pod \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\" (UID: \"0d3a592c-85aa-455c-a39e-cf2ec5c1f292\") " Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.499575 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ceph" (OuterVolumeSpecName: "ceph") pod "0d3a592c-85aa-455c-a39e-cf2ec5c1f292" (UID: "0d3a592c-85aa-455c-a39e-cf2ec5c1f292"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.499944 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-kube-api-access-9crxr" (OuterVolumeSpecName: "kube-api-access-9crxr") pod "0d3a592c-85aa-455c-a39e-cf2ec5c1f292" (UID: "0d3a592c-85aa-455c-a39e-cf2ec5c1f292"). InnerVolumeSpecName "kube-api-access-9crxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.520942 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d3a592c-85aa-455c-a39e-cf2ec5c1f292" (UID: "0d3a592c-85aa-455c-a39e-cf2ec5c1f292"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.526606 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-inventory" (OuterVolumeSpecName: "inventory") pod "0d3a592c-85aa-455c-a39e-cf2ec5c1f292" (UID: "0d3a592c-85aa-455c-a39e-cf2ec5c1f292"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.595681 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.595711 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.595719 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:47 crc kubenswrapper[4959]: I1007 13:36:47.595732 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9crxr\" (UniqueName: \"kubernetes.io/projected/0d3a592c-85aa-455c-a39e-cf2ec5c1f292-kube-api-access-9crxr\") on node \"crc\" DevicePath \"\"" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.058298 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" event={"ID":"0d3a592c-85aa-455c-a39e-cf2ec5c1f292","Type":"ContainerDied","Data":"a445e16f6eee0c1f40a53cad890cc5c996a8e35187b5b2a46451e45a3ba90b7a"} Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.058378 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.058376 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a445e16f6eee0c1f40a53cad890cc5c996a8e35187b5b2a46451e45a3ba90b7a" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.152208 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997"] Oct 07 13:36:48 crc kubenswrapper[4959]: E1007 13:36:48.152546 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3a592c-85aa-455c-a39e-cf2ec5c1f292" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.152565 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3a592c-85aa-455c-a39e-cf2ec5c1f292" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.152840 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3a592c-85aa-455c-a39e-cf2ec5c1f292" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.153449 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.158539 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.162534 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.163618 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.165905 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.167455 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997"] Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.167743 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.205052 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7kcj\" (UniqueName: \"kubernetes.io/projected/5e262fa9-5abf-4283-99ed-ead5affb1282-kube-api-access-f7kcj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.205544 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.205876 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.206092 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.309029 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.309862 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7kcj\" (UniqueName: \"kubernetes.io/projected/5e262fa9-5abf-4283-99ed-ead5affb1282-kube-api-access-f7kcj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.310000 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.310150 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.315235 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.316212 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.316375 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.339708 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f7kcj\" (UniqueName: \"kubernetes.io/projected/5e262fa9-5abf-4283-99ed-ead5affb1282-kube-api-access-f7kcj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b997\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:48 crc kubenswrapper[4959]: I1007 13:36:48.480695 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:36:49 crc kubenswrapper[4959]: I1007 13:36:49.025332 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997"] Oct 07 13:36:49 crc kubenswrapper[4959]: I1007 13:36:49.069397 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" event={"ID":"5e262fa9-5abf-4283-99ed-ead5affb1282","Type":"ContainerStarted","Data":"68b59c635c94170b9133060bba28d6d99f4d72b94e4a7b6420901c1640cef7b2"} Oct 07 13:36:49 crc kubenswrapper[4959]: I1007 13:36:49.642455 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:36:50 crc kubenswrapper[4959]: I1007 13:36:50.079555 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" event={"ID":"5e262fa9-5abf-4283-99ed-ead5affb1282","Type":"ContainerStarted","Data":"5fba550e9846a839f062ea1624f3833279ebad9cec52022a1b8db810c41d56a8"} Oct 07 13:36:50 crc kubenswrapper[4959]: I1007 13:36:50.095139 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" podStartSLOduration=1.4908131789999999 podStartE2EDuration="2.095117932s" podCreationTimestamp="2025-10-07 13:36:48 +0000 UTC" firstStartedPulling="2025-10-07 13:36:49.033777708 +0000 UTC 
m=+2161.194500385" lastFinishedPulling="2025-10-07 13:36:49.638082451 +0000 UTC m=+2161.798805138" observedRunningTime="2025-10-07 13:36:50.09505421 +0000 UTC m=+2162.255776897" watchObservedRunningTime="2025-10-07 13:36:50.095117932 +0000 UTC m=+2162.255840619" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.336391 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fjjjm"] Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.338807 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.353919 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjjjm"] Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.476165 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-utilities\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.476445 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-catalog-content\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.476478 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kd5\" (UniqueName: \"kubernetes.io/projected/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-kube-api-access-r9kd5\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " 
pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.578426 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-utilities\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.578471 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-catalog-content\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.578500 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kd5\" (UniqueName: \"kubernetes.io/projected/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-kube-api-access-r9kd5\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.579098 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-utilities\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.579124 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-catalog-content\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc 
kubenswrapper[4959]: I1007 13:37:11.610053 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kd5\" (UniqueName: \"kubernetes.io/projected/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-kube-api-access-r9kd5\") pod \"redhat-operators-fjjjm\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") " pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:11 crc kubenswrapper[4959]: I1007 13:37:11.668893 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:12 crc kubenswrapper[4959]: I1007 13:37:12.139901 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjjjm"] Oct 07 13:37:12 crc kubenswrapper[4959]: I1007 13:37:12.290854 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerStarted","Data":"56d1b565425a88b02d8c7a2917a50be4d24fd69a7b8729a8a15d29975a5cc0cd"} Oct 07 13:37:13 crc kubenswrapper[4959]: I1007 13:37:13.302232 4959 generic.go:334] "Generic (PLEG): container finished" podID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerID="8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302" exitCode=0 Oct 07 13:37:13 crc kubenswrapper[4959]: I1007 13:37:13.302320 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerDied","Data":"8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302"} Oct 07 13:37:15 crc kubenswrapper[4959]: I1007 13:37:15.320830 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerStarted","Data":"34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30"} Oct 07 13:37:16 crc kubenswrapper[4959]: I1007 
13:37:16.333658 4959 generic.go:334] "Generic (PLEG): container finished" podID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerID="34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30" exitCode=0 Oct 07 13:37:16 crc kubenswrapper[4959]: I1007 13:37:16.333731 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerDied","Data":"34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30"} Oct 07 13:37:19 crc kubenswrapper[4959]: I1007 13:37:19.370704 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerStarted","Data":"28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab"} Oct 07 13:37:21 crc kubenswrapper[4959]: I1007 13:37:21.669114 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:21 crc kubenswrapper[4959]: I1007 13:37:21.670136 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fjjjm" Oct 07 13:37:22 crc kubenswrapper[4959]: I1007 13:37:22.719862 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fjjjm" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="registry-server" probeResult="failure" output=< Oct 07 13:37:22 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 13:37:22 crc kubenswrapper[4959]: > Oct 07 13:37:25 crc kubenswrapper[4959]: I1007 13:37:25.420736 4959 generic.go:334] "Generic (PLEG): container finished" podID="5e262fa9-5abf-4283-99ed-ead5affb1282" containerID="5fba550e9846a839f062ea1624f3833279ebad9cec52022a1b8db810c41d56a8" exitCode=0 Oct 07 13:37:25 crc kubenswrapper[4959]: I1007 13:37:25.420810 4959 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" event={"ID":"5e262fa9-5abf-4283-99ed-ead5affb1282","Type":"ContainerDied","Data":"5fba550e9846a839f062ea1624f3833279ebad9cec52022a1b8db810c41d56a8"} Oct 07 13:37:25 crc kubenswrapper[4959]: I1007 13:37:25.447522 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fjjjm" podStartSLOduration=8.879403442 podStartE2EDuration="14.447500864s" podCreationTimestamp="2025-10-07 13:37:11 +0000 UTC" firstStartedPulling="2025-10-07 13:37:13.304748302 +0000 UTC m=+2185.465470979" lastFinishedPulling="2025-10-07 13:37:18.872845724 +0000 UTC m=+2191.033568401" observedRunningTime="2025-10-07 13:37:19.392081537 +0000 UTC m=+2191.552804234" watchObservedRunningTime="2025-10-07 13:37:25.447500864 +0000 UTC m=+2197.608223541" Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.884919 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.980424 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7kcj\" (UniqueName: \"kubernetes.io/projected/5e262fa9-5abf-4283-99ed-ead5affb1282-kube-api-access-f7kcj\") pod \"5e262fa9-5abf-4283-99ed-ead5affb1282\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.980561 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ceph\") pod \"5e262fa9-5abf-4283-99ed-ead5affb1282\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.980705 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ssh-key\") pod \"5e262fa9-5abf-4283-99ed-ead5affb1282\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.980879 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-inventory\") pod \"5e262fa9-5abf-4283-99ed-ead5affb1282\" (UID: \"5e262fa9-5abf-4283-99ed-ead5affb1282\") " Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.990824 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ceph" (OuterVolumeSpecName: "ceph") pod "5e262fa9-5abf-4283-99ed-ead5affb1282" (UID: "5e262fa9-5abf-4283-99ed-ead5affb1282"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:37:26 crc kubenswrapper[4959]: I1007 13:37:26.997557 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e262fa9-5abf-4283-99ed-ead5affb1282-kube-api-access-f7kcj" (OuterVolumeSpecName: "kube-api-access-f7kcj") pod "5e262fa9-5abf-4283-99ed-ead5affb1282" (UID: "5e262fa9-5abf-4283-99ed-ead5affb1282"). InnerVolumeSpecName "kube-api-access-f7kcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.010737 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e262fa9-5abf-4283-99ed-ead5affb1282" (UID: "5e262fa9-5abf-4283-99ed-ead5affb1282"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.015105 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-inventory" (OuterVolumeSpecName: "inventory") pod "5e262fa9-5abf-4283-99ed-ead5affb1282" (UID: "5e262fa9-5abf-4283-99ed-ead5affb1282"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.083383 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.083427 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7kcj\" (UniqueName: \"kubernetes.io/projected/5e262fa9-5abf-4283-99ed-ead5affb1282-kube-api-access-f7kcj\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.083441 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ceph\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.083452 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e262fa9-5abf-4283-99ed-ead5affb1282-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.440792 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997" event={"ID":"5e262fa9-5abf-4283-99ed-ead5affb1282","Type":"ContainerDied","Data":"68b59c635c94170b9133060bba28d6d99f4d72b94e4a7b6420901c1640cef7b2"}
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.440833 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b59c635c94170b9133060bba28d6d99f4d72b94e4a7b6420901c1640cef7b2"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.440891 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b997"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.525939 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"]
Oct 07 13:37:27 crc kubenswrapper[4959]: E1007 13:37:27.526430 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e262fa9-5abf-4283-99ed-ead5affb1282" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.526458 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e262fa9-5abf-4283-99ed-ead5affb1282" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.526714 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e262fa9-5abf-4283-99ed-ead5affb1282" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.527551 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.529852 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.530138 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.530582 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.530726 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.530736 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.548889 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"]
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.593260 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.593345 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.593480 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nsp8\" (UniqueName: \"kubernetes.io/projected/090ad048-3bec-4657-b329-1fbdba663340-kube-api-access-2nsp8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.593513 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.695615 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.695700 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nsp8\" (UniqueName: \"kubernetes.io/projected/090ad048-3bec-4657-b329-1fbdba663340-kube-api-access-2nsp8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.695728 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.695835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.700586 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.701063 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.711250 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.714077 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nsp8\" (UniqueName: \"kubernetes.io/projected/090ad048-3bec-4657-b329-1fbdba663340-kube-api-access-2nsp8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:27 crc kubenswrapper[4959]: I1007 13:37:27.846550 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:28 crc kubenswrapper[4959]: I1007 13:37:28.494231 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"]
Oct 07 13:37:29 crc kubenswrapper[4959]: I1007 13:37:29.458193 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b" event={"ID":"090ad048-3bec-4657-b329-1fbdba663340","Type":"ContainerStarted","Data":"abc80f92b8c15ca4e1b826c80d8f84b8459b1ce08018b35d3236823685040b10"}
Oct 07 13:37:29 crc kubenswrapper[4959]: I1007 13:37:29.458817 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b" event={"ID":"090ad048-3bec-4657-b329-1fbdba663340","Type":"ContainerStarted","Data":"f8394fbe484009654b12c594b3d5e3a45a89ee7df7f5b303a0dab03af0e1959f"}
Oct 07 13:37:29 crc kubenswrapper[4959]: I1007 13:37:29.476482 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b" podStartSLOduration=1.9604586899999998 podStartE2EDuration="2.476465121s" podCreationTimestamp="2025-10-07 13:37:27 +0000 UTC" firstStartedPulling="2025-10-07 13:37:28.497487768 +0000 UTC m=+2200.658210455" lastFinishedPulling="2025-10-07 13:37:29.013494209 +0000 UTC m=+2201.174216886" observedRunningTime="2025-10-07 13:37:29.47226321 +0000 UTC m=+2201.632985887" watchObservedRunningTime="2025-10-07 13:37:29.476465121 +0000 UTC m=+2201.637187798"
Oct 07 13:37:31 crc kubenswrapper[4959]: I1007 13:37:31.725162 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fjjjm"
Oct 07 13:37:31 crc kubenswrapper[4959]: I1007 13:37:31.781468 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fjjjm"
Oct 07 13:37:31 crc kubenswrapper[4959]: I1007 13:37:31.958340 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjjjm"]
Oct 07 13:37:33 crc kubenswrapper[4959]: I1007 13:37:33.499262 4959 generic.go:334] "Generic (PLEG): container finished" podID="090ad048-3bec-4657-b329-1fbdba663340" containerID="abc80f92b8c15ca4e1b826c80d8f84b8459b1ce08018b35d3236823685040b10" exitCode=0
Oct 07 13:37:33 crc kubenswrapper[4959]: I1007 13:37:33.499357 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b" event={"ID":"090ad048-3bec-4657-b329-1fbdba663340","Type":"ContainerDied","Data":"abc80f92b8c15ca4e1b826c80d8f84b8459b1ce08018b35d3236823685040b10"}
Oct 07 13:37:33 crc kubenswrapper[4959]: I1007 13:37:33.499789 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fjjjm" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="registry-server" containerID="cri-o://28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab" gracePeriod=2
Oct 07 13:37:33 crc kubenswrapper[4959]: I1007 13:37:33.947980 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjjjm"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.029276 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-catalog-content\") pod \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") "
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.029355 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-utilities\") pod \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") "
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.029459 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9kd5\" (UniqueName: \"kubernetes.io/projected/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-kube-api-access-r9kd5\") pod \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\" (UID: \"2b1d30ce-8cc6-4167-b255-bdde14f57ee2\") "
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.030494 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-utilities" (OuterVolumeSpecName: "utilities") pod "2b1d30ce-8cc6-4167-b255-bdde14f57ee2" (UID: "2b1d30ce-8cc6-4167-b255-bdde14f57ee2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.041419 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-kube-api-access-r9kd5" (OuterVolumeSpecName: "kube-api-access-r9kd5") pod "2b1d30ce-8cc6-4167-b255-bdde14f57ee2" (UID: "2b1d30ce-8cc6-4167-b255-bdde14f57ee2"). InnerVolumeSpecName "kube-api-access-r9kd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.115172 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b1d30ce-8cc6-4167-b255-bdde14f57ee2" (UID: "2b1d30ce-8cc6-4167-b255-bdde14f57ee2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.131318 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.131352 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.131365 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9kd5\" (UniqueName: \"kubernetes.io/projected/2b1d30ce-8cc6-4167-b255-bdde14f57ee2-kube-api-access-r9kd5\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.516909 4959 generic.go:334] "Generic (PLEG): container finished" podID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerID="28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab" exitCode=0
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.516972 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerDied","Data":"28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab"}
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.517032 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjjjm" event={"ID":"2b1d30ce-8cc6-4167-b255-bdde14f57ee2","Type":"ContainerDied","Data":"56d1b565425a88b02d8c7a2917a50be4d24fd69a7b8729a8a15d29975a5cc0cd"}
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.517025 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjjjm"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.517055 4959 scope.go:117] "RemoveContainer" containerID="28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.550952 4959 scope.go:117] "RemoveContainer" containerID="34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.560052 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjjjm"]
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.583000 4959 scope.go:117] "RemoveContainer" containerID="8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.585531 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fjjjm"]
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.616246 4959 scope.go:117] "RemoveContainer" containerID="28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab"
Oct 07 13:37:34 crc kubenswrapper[4959]: E1007 13:37:34.616760 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab\": container with ID starting with 28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab not found: ID does not exist" containerID="28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.616823 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab"} err="failed to get container status \"28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab\": rpc error: code = NotFound desc = could not find container \"28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab\": container with ID starting with 28e8af0717a4d459dd7579582a1ae50153caa219cf4f61303ca0504dd4819aab not found: ID does not exist"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.616863 4959 scope.go:117] "RemoveContainer" containerID="34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30"
Oct 07 13:37:34 crc kubenswrapper[4959]: E1007 13:37:34.617208 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30\": container with ID starting with 34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30 not found: ID does not exist" containerID="34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.617236 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30"} err="failed to get container status \"34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30\": rpc error: code = NotFound desc = could not find container \"34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30\": container with ID starting with 34c8bcc4dbfe9796b2ce5ab47b2d5727927f51ed059199227d86f34222390b30 not found: ID does not exist"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.617260 4959 scope.go:117] "RemoveContainer" containerID="8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302"
Oct 07 13:37:34 crc kubenswrapper[4959]: E1007 13:37:34.617482 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302\": container with ID starting with 8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302 not found: ID does not exist" containerID="8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.617498 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302"} err="failed to get container status \"8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302\": rpc error: code = NotFound desc = could not find container \"8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302\": container with ID starting with 8d9671137993c6606b7eb0a7a4edb06aec7127fb540e507d383d2d8ec12c9302 not found: ID does not exist"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.847752 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" path="/var/lib/kubelet/pods/2b1d30ce-8cc6-4167-b255-bdde14f57ee2/volumes"
Oct 07 13:37:34 crc kubenswrapper[4959]: I1007 13:37:34.989679 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.048994 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ceph\") pod \"090ad048-3bec-4657-b329-1fbdba663340\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") "
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.049114 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-inventory\") pod \"090ad048-3bec-4657-b329-1fbdba663340\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") "
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.049169 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nsp8\" (UniqueName: \"kubernetes.io/projected/090ad048-3bec-4657-b329-1fbdba663340-kube-api-access-2nsp8\") pod \"090ad048-3bec-4657-b329-1fbdba663340\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") "
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.049306 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ssh-key\") pod \"090ad048-3bec-4657-b329-1fbdba663340\" (UID: \"090ad048-3bec-4657-b329-1fbdba663340\") "
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.053855 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090ad048-3bec-4657-b329-1fbdba663340-kube-api-access-2nsp8" (OuterVolumeSpecName: "kube-api-access-2nsp8") pod "090ad048-3bec-4657-b329-1fbdba663340" (UID: "090ad048-3bec-4657-b329-1fbdba663340"). InnerVolumeSpecName "kube-api-access-2nsp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.054036 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ceph" (OuterVolumeSpecName: "ceph") pod "090ad048-3bec-4657-b329-1fbdba663340" (UID: "090ad048-3bec-4657-b329-1fbdba663340"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.072921 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "090ad048-3bec-4657-b329-1fbdba663340" (UID: "090ad048-3bec-4657-b329-1fbdba663340"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.073695 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-inventory" (OuterVolumeSpecName: "inventory") pod "090ad048-3bec-4657-b329-1fbdba663340" (UID: "090ad048-3bec-4657-b329-1fbdba663340"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.151025 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nsp8\" (UniqueName: \"kubernetes.io/projected/090ad048-3bec-4657-b329-1fbdba663340-kube-api-access-2nsp8\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.151528 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.151588 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-ceph\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.151674 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ad048-3bec-4657-b329-1fbdba663340-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.531184 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b" event={"ID":"090ad048-3bec-4657-b329-1fbdba663340","Type":"ContainerDied","Data":"f8394fbe484009654b12c594b3d5e3a45a89ee7df7f5b303a0dab03af0e1959f"}
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.531228 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8394fbe484009654b12c594b3d5e3a45a89ee7df7f5b303a0dab03af0e1959f"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.531238 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608157 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"]
Oct 07 13:37:35 crc kubenswrapper[4959]: E1007 13:37:35.608536 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="registry-server"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608549 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="registry-server"
Oct 07 13:37:35 crc kubenswrapper[4959]: E1007 13:37:35.608564 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090ad048-3bec-4657-b329-1fbdba663340" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608571 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="090ad048-3bec-4657-b329-1fbdba663340" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:37:35 crc kubenswrapper[4959]: E1007 13:37:35.608595 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="extract-content"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608602 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="extract-content"
Oct 07 13:37:35 crc kubenswrapper[4959]: E1007 13:37:35.608661 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="extract-utilities"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608670 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="extract-utilities"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608868 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1d30ce-8cc6-4167-b255-bdde14f57ee2" containerName="registry-server"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.608890 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="090ad048-3bec-4657-b329-1fbdba663340" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.609607 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.612127 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.612642 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.612749 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.612773 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.621186 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"]
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.621371 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.660456 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-kube-api-access-ntt7w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.660520 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.660559 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.660607 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.762413 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-kube-api-access-ntt7w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.762475 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.762510 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.762542 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.766875 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.766934 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.767468 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.783677 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-kube-api-access-ntt7w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vgg98\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:35 crc kubenswrapper[4959]: I1007 13:37:35.929396 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:37:36 crc kubenswrapper[4959]: I1007 13:37:36.434651 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"]
Oct 07 13:37:36 crc kubenswrapper[4959]: I1007 13:37:36.540221 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98" event={"ID":"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860","Type":"ContainerStarted","Data":"1987b88708dcd0199b80f0e9bc0f26e9877cde648fc5f9c6c859fc9705455f2e"}
Oct 07 13:37:37 crc kubenswrapper[4959]: I1007 13:37:37.549065 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98" event={"ID":"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860","Type":"ContainerStarted","Data":"84615693f07e6dfded84bb68a2c963ccefb252ec9aa9022b1d1408af17535479"}
Oct 07 13:37:37 crc kubenswrapper[4959]: I1007 13:37:37.571234 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98" podStartSLOduration=2.101345927 podStartE2EDuration="2.571212957s" podCreationTimestamp="2025-10-07 13:37:35 +0000 UTC" firstStartedPulling="2025-10-07 13:37:36.431600749 +0000 UTC m=+2208.592323466" lastFinishedPulling="2025-10-07 13:37:36.901467819 +0000 UTC m=+2209.062190496" observedRunningTime="2025-10-07 13:37:37.563198436 +0000 UTC m=+2209.723921113" watchObservedRunningTime="2025-10-07 13:37:37.571212957 +0000 UTC m=+2209.731935634"
Oct 07 13:38:17 crc kubenswrapper[4959]: I1007 13:38:17.875102 4959 generic.go:334] "Generic (PLEG): container finished" podID="cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" containerID="84615693f07e6dfded84bb68a2c963ccefb252ec9aa9022b1d1408af17535479" exitCode=0
Oct 07 13:38:17 crc kubenswrapper[4959]: I1007 13:38:17.875222 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98" event={"ID":"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860","Type":"ContainerDied","Data":"84615693f07e6dfded84bb68a2c963ccefb252ec9aa9022b1d1408af17535479"}
Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.255524 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98"
Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.264273 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ceph\") pod \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") "
Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.264406 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-inventory\") pod \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") "
Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.264457 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-kube-api-access-ntt7w\") pod \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") "
Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.264526 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ssh-key\") pod \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\" (UID: \"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860\") "
Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.270183 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ceph" (OuterVolumeSpecName: "ceph") pod "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" (UID: "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.270317 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-kube-api-access-ntt7w" (OuterVolumeSpecName: "kube-api-access-ntt7w") pod "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" (UID: "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860"). InnerVolumeSpecName "kube-api-access-ntt7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.301120 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-inventory" (OuterVolumeSpecName: "inventory") pod "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" (UID: "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.303911 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" (UID: "cffeb5da-ab9c-4c47-a6e2-2e647c4ac860"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.366122 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.366169 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-kube-api-access-ntt7w\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.366180 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.366188 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cffeb5da-ab9c-4c47-a6e2-2e647c4ac860-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.896118 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98" event={"ID":"cffeb5da-ab9c-4c47-a6e2-2e647c4ac860","Type":"ContainerDied","Data":"1987b88708dcd0199b80f0e9bc0f26e9877cde648fc5f9c6c859fc9705455f2e"} Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.896202 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1987b88708dcd0199b80f0e9bc0f26e9877cde648fc5f9c6c859fc9705455f2e" Oct 07 13:38:19 crc kubenswrapper[4959]: I1007 13:38:19.896216 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vgg98" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.019912 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5cbpb"] Oct 07 13:38:20 crc kubenswrapper[4959]: E1007 13:38:20.020888 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.020922 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.021232 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffeb5da-ab9c-4c47-a6e2-2e647c4ac860" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.024702 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.027093 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.027809 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.029131 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.029191 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.030499 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.042562 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5cbpb"] Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.079775 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.080345 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g52q\" (UniqueName: \"kubernetes.io/projected/e4d99350-2d4f-451a-a539-e7a72f41ad3a-kube-api-access-2g52q\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.080441 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ceph\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.080506 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.181658 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g52q\" (UniqueName: \"kubernetes.io/projected/e4d99350-2d4f-451a-a539-e7a72f41ad3a-kube-api-access-2g52q\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.181786 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ceph\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.181847 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: 
\"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.181978 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.185385 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.187186 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ceph\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.188911 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.206147 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g52q\" (UniqueName: \"kubernetes.io/projected/e4d99350-2d4f-451a-a539-e7a72f41ad3a-kube-api-access-2g52q\") pod 
\"ssh-known-hosts-edpm-deployment-5cbpb\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.355981 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.892580 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5cbpb"] Oct 07 13:38:20 crc kubenswrapper[4959]: I1007 13:38:20.911710 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" event={"ID":"e4d99350-2d4f-451a-a539-e7a72f41ad3a","Type":"ContainerStarted","Data":"058d16a2174fe7c8b9251a1ae334cb5832e18c82d808058f4a4123a06d3a9c27"} Oct 07 13:38:21 crc kubenswrapper[4959]: I1007 13:38:21.921521 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" event={"ID":"e4d99350-2d4f-451a-a539-e7a72f41ad3a","Type":"ContainerStarted","Data":"2f1f367bd7da7de9156614ff6e82399e1733db0c89f9ee484bd5360451c91cc1"} Oct 07 13:38:21 crc kubenswrapper[4959]: I1007 13:38:21.944298 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" podStartSLOduration=2.320702873 podStartE2EDuration="2.944278571s" podCreationTimestamp="2025-10-07 13:38:19 +0000 UTC" firstStartedPulling="2025-10-07 13:38:20.903688004 +0000 UTC m=+2253.064410681" lastFinishedPulling="2025-10-07 13:38:21.527263712 +0000 UTC m=+2253.687986379" observedRunningTime="2025-10-07 13:38:21.938359311 +0000 UTC m=+2254.099081988" watchObservedRunningTime="2025-10-07 13:38:21.944278571 +0000 UTC m=+2254.105001248" Oct 07 13:38:31 crc kubenswrapper[4959]: I1007 13:38:31.008396 4959 generic.go:334] "Generic (PLEG): container finished" podID="e4d99350-2d4f-451a-a539-e7a72f41ad3a" 
containerID="2f1f367bd7da7de9156614ff6e82399e1733db0c89f9ee484bd5360451c91cc1" exitCode=0 Oct 07 13:38:31 crc kubenswrapper[4959]: I1007 13:38:31.008563 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" event={"ID":"e4d99350-2d4f-451a-a539-e7a72f41ad3a","Type":"ContainerDied","Data":"2f1f367bd7da7de9156614ff6e82399e1733db0c89f9ee484bd5360451c91cc1"} Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.374654 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.514260 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-inventory-0\") pod \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.514487 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ceph\") pod \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.515037 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ssh-key-openstack-edpm-ipam\") pod \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\" (UID: \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.515071 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g52q\" (UniqueName: \"kubernetes.io/projected/e4d99350-2d4f-451a-a539-e7a72f41ad3a-kube-api-access-2g52q\") pod \"e4d99350-2d4f-451a-a539-e7a72f41ad3a\" (UID: 
\"e4d99350-2d4f-451a-a539-e7a72f41ad3a\") " Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.519784 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4d99350-2d4f-451a-a539-e7a72f41ad3a-kube-api-access-2g52q" (OuterVolumeSpecName: "kube-api-access-2g52q") pod "e4d99350-2d4f-451a-a539-e7a72f41ad3a" (UID: "e4d99350-2d4f-451a-a539-e7a72f41ad3a"). InnerVolumeSpecName "kube-api-access-2g52q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.519817 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ceph" (OuterVolumeSpecName: "ceph") pod "e4d99350-2d4f-451a-a539-e7a72f41ad3a" (UID: "e4d99350-2d4f-451a-a539-e7a72f41ad3a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.559046 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e4d99350-2d4f-451a-a539-e7a72f41ad3a" (UID: "e4d99350-2d4f-451a-a539-e7a72f41ad3a"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.565801 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e4d99350-2d4f-451a-a539-e7a72f41ad3a" (UID: "e4d99350-2d4f-451a-a539-e7a72f41ad3a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.616807 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.616956 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.617024 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g52q\" (UniqueName: \"kubernetes.io/projected/e4d99350-2d4f-451a-a539-e7a72f41ad3a-kube-api-access-2g52q\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:32 crc kubenswrapper[4959]: I1007 13:38:32.617083 4959 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e4d99350-2d4f-451a-a539-e7a72f41ad3a-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.025494 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" event={"ID":"e4d99350-2d4f-451a-a539-e7a72f41ad3a","Type":"ContainerDied","Data":"058d16a2174fe7c8b9251a1ae334cb5832e18c82d808058f4a4123a06d3a9c27"} Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.025803 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="058d16a2174fe7c8b9251a1ae334cb5832e18c82d808058f4a4123a06d3a9c27" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.025530 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5cbpb" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.092926 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5"] Oct 07 13:38:33 crc kubenswrapper[4959]: E1007 13:38:33.093298 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d99350-2d4f-451a-a539-e7a72f41ad3a" containerName="ssh-known-hosts-edpm-deployment" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.093314 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d99350-2d4f-451a-a539-e7a72f41ad3a" containerName="ssh-known-hosts-edpm-deployment" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.093514 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4d99350-2d4f-451a-a539-e7a72f41ad3a" containerName="ssh-known-hosts-edpm-deployment" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.094096 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.096594 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.097047 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.097222 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.097394 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.101130 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.107932 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5"] Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.226942 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.226988 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.227021 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbsv\" (UniqueName: \"kubernetes.io/projected/feecb62b-99f0-41a7-80ce-3e8538801512-kube-api-access-8cbsv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.227191 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.329920 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.330058 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.330104 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.330131 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbsv\" (UniqueName: \"kubernetes.io/projected/feecb62b-99f0-41a7-80ce-3e8538801512-kube-api-access-8cbsv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.334674 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.334703 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.339065 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.353675 4959 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8cbsv\" (UniqueName: \"kubernetes.io/projected/feecb62b-99f0-41a7-80ce-3e8538801512-kube-api-access-8cbsv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-t4pk5\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.413303 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:33 crc kubenswrapper[4959]: I1007 13:38:33.930685 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5"] Oct 07 13:38:34 crc kubenswrapper[4959]: I1007 13:38:34.034260 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" event={"ID":"feecb62b-99f0-41a7-80ce-3e8538801512","Type":"ContainerStarted","Data":"cc887746ecd9f40cc9a384800c925e4568f82d9d13c52d22f2c1b9454e3518e2"} Oct 07 13:38:36 crc kubenswrapper[4959]: I1007 13:38:36.068080 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" event={"ID":"feecb62b-99f0-41a7-80ce-3e8538801512","Type":"ContainerStarted","Data":"89aa1ba490c4f06f49dae33b40d9e6d9ec4afca92a089a0cd1a34a3971b7a380"} Oct 07 13:38:36 crc kubenswrapper[4959]: I1007 13:38:36.093617 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" podStartSLOduration=2.251689432 podStartE2EDuration="3.093586388s" podCreationTimestamp="2025-10-07 13:38:33 +0000 UTC" firstStartedPulling="2025-10-07 13:38:33.932329307 +0000 UTC m=+2266.093051994" lastFinishedPulling="2025-10-07 13:38:34.774226283 +0000 UTC m=+2266.934948950" observedRunningTime="2025-10-07 13:38:36.086839164 +0000 UTC m=+2268.247561841" watchObservedRunningTime="2025-10-07 
13:38:36.093586388 +0000 UTC m=+2268.254309065" Oct 07 13:38:37 crc kubenswrapper[4959]: I1007 13:38:37.695777 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:38:37 crc kubenswrapper[4959]: I1007 13:38:37.696103 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:38:43 crc kubenswrapper[4959]: I1007 13:38:43.128871 4959 generic.go:334] "Generic (PLEG): container finished" podID="feecb62b-99f0-41a7-80ce-3e8538801512" containerID="89aa1ba490c4f06f49dae33b40d9e6d9ec4afca92a089a0cd1a34a3971b7a380" exitCode=0 Oct 07 13:38:43 crc kubenswrapper[4959]: I1007 13:38:43.128959 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" event={"ID":"feecb62b-99f0-41a7-80ce-3e8538801512","Type":"ContainerDied","Data":"89aa1ba490c4f06f49dae33b40d9e6d9ec4afca92a089a0cd1a34a3971b7a380"} Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.526235 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.696192 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbsv\" (UniqueName: \"kubernetes.io/projected/feecb62b-99f0-41a7-80ce-3e8538801512-kube-api-access-8cbsv\") pod \"feecb62b-99f0-41a7-80ce-3e8538801512\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.696700 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-inventory\") pod \"feecb62b-99f0-41a7-80ce-3e8538801512\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.696770 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ssh-key\") pod \"feecb62b-99f0-41a7-80ce-3e8538801512\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.696877 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ceph\") pod \"feecb62b-99f0-41a7-80ce-3e8538801512\" (UID: \"feecb62b-99f0-41a7-80ce-3e8538801512\") " Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.706138 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feecb62b-99f0-41a7-80ce-3e8538801512-kube-api-access-8cbsv" (OuterVolumeSpecName: "kube-api-access-8cbsv") pod "feecb62b-99f0-41a7-80ce-3e8538801512" (UID: "feecb62b-99f0-41a7-80ce-3e8538801512"). InnerVolumeSpecName "kube-api-access-8cbsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.710826 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ceph" (OuterVolumeSpecName: "ceph") pod "feecb62b-99f0-41a7-80ce-3e8538801512" (UID: "feecb62b-99f0-41a7-80ce-3e8538801512"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.729964 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-inventory" (OuterVolumeSpecName: "inventory") pod "feecb62b-99f0-41a7-80ce-3e8538801512" (UID: "feecb62b-99f0-41a7-80ce-3e8538801512"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.754203 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "feecb62b-99f0-41a7-80ce-3e8538801512" (UID: "feecb62b-99f0-41a7-80ce-3e8538801512"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.798798 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.798846 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbsv\" (UniqueName: \"kubernetes.io/projected/feecb62b-99f0-41a7-80ce-3e8538801512-kube-api-access-8cbsv\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.798862 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:44 crc kubenswrapper[4959]: I1007 13:38:44.798870 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feecb62b-99f0-41a7-80ce-3e8538801512-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.149350 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" event={"ID":"feecb62b-99f0-41a7-80ce-3e8538801512","Type":"ContainerDied","Data":"cc887746ecd9f40cc9a384800c925e4568f82d9d13c52d22f2c1b9454e3518e2"} Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.149954 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc887746ecd9f40cc9a384800c925e4568f82d9d13c52d22f2c1b9454e3518e2" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.149399 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-t4pk5" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.229262 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv"] Oct 07 13:38:45 crc kubenswrapper[4959]: E1007 13:38:45.229926 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feecb62b-99f0-41a7-80ce-3e8538801512" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.229952 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="feecb62b-99f0-41a7-80ce-3e8538801512" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.230196 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="feecb62b-99f0-41a7-80ce-3e8538801512" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.230988 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.233825 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.236579 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.236830 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.236841 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.236975 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.255556 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv"] Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.411459 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.411881 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.412069 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc7d9\" (UniqueName: \"kubernetes.io/projected/1d2aa3cc-f250-4d9e-b6da-921018115809-kube-api-access-jc7d9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.412262 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.513542 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.513747 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.513776 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.513905 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc7d9\" (UniqueName: \"kubernetes.io/projected/1d2aa3cc-f250-4d9e-b6da-921018115809-kube-api-access-jc7d9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.518975 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.524455 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.525044 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.534055 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc7d9\" (UniqueName: \"kubernetes.io/projected/1d2aa3cc-f250-4d9e-b6da-921018115809-kube-api-access-jc7d9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:45 crc kubenswrapper[4959]: I1007 13:38:45.553904 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:46 crc kubenswrapper[4959]: I1007 13:38:46.078167 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv"] Oct 07 13:38:46 crc kubenswrapper[4959]: I1007 13:38:46.157848 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" event={"ID":"1d2aa3cc-f250-4d9e-b6da-921018115809","Type":"ContainerStarted","Data":"38917bb0076b0f305c7f5dff2f17b8fe4374173919baf6f7745edcdce10d47c8"} Oct 07 13:38:48 crc kubenswrapper[4959]: I1007 13:38:48.179639 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" event={"ID":"1d2aa3cc-f250-4d9e-b6da-921018115809","Type":"ContainerStarted","Data":"0c3cae13fe67da73961024f005483e58583b84e60e337386c92cd3728df1a09b"} Oct 07 13:38:56 crc kubenswrapper[4959]: I1007 13:38:56.245812 4959 generic.go:334] "Generic (PLEG): container finished" podID="1d2aa3cc-f250-4d9e-b6da-921018115809" containerID="0c3cae13fe67da73961024f005483e58583b84e60e337386c92cd3728df1a09b" exitCode=0 Oct 07 13:38:56 crc kubenswrapper[4959]: I1007 13:38:56.245930 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" 
event={"ID":"1d2aa3cc-f250-4d9e-b6da-921018115809","Type":"ContainerDied","Data":"0c3cae13fe67da73961024f005483e58583b84e60e337386c92cd3728df1a09b"} Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.649528 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.827784 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ssh-key\") pod \"1d2aa3cc-f250-4d9e-b6da-921018115809\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.828223 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc7d9\" (UniqueName: \"kubernetes.io/projected/1d2aa3cc-f250-4d9e-b6da-921018115809-kube-api-access-jc7d9\") pod \"1d2aa3cc-f250-4d9e-b6da-921018115809\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.828274 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ceph\") pod \"1d2aa3cc-f250-4d9e-b6da-921018115809\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.828411 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-inventory\") pod \"1d2aa3cc-f250-4d9e-b6da-921018115809\" (UID: \"1d2aa3cc-f250-4d9e-b6da-921018115809\") " Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.833904 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2aa3cc-f250-4d9e-b6da-921018115809-kube-api-access-jc7d9" (OuterVolumeSpecName: 
"kube-api-access-jc7d9") pod "1d2aa3cc-f250-4d9e-b6da-921018115809" (UID: "1d2aa3cc-f250-4d9e-b6da-921018115809"). InnerVolumeSpecName "kube-api-access-jc7d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.833955 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ceph" (OuterVolumeSpecName: "ceph") pod "1d2aa3cc-f250-4d9e-b6da-921018115809" (UID: "1d2aa3cc-f250-4d9e-b6da-921018115809"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.854088 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1d2aa3cc-f250-4d9e-b6da-921018115809" (UID: "1d2aa3cc-f250-4d9e-b6da-921018115809"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.860392 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-inventory" (OuterVolumeSpecName: "inventory") pod "1d2aa3cc-f250-4d9e-b6da-921018115809" (UID: "1d2aa3cc-f250-4d9e-b6da-921018115809"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.930279 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.930315 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.930325 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc7d9\" (UniqueName: \"kubernetes.io/projected/1d2aa3cc-f250-4d9e-b6da-921018115809-kube-api-access-jc7d9\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:57 crc kubenswrapper[4959]: I1007 13:38:57.930336 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1d2aa3cc-f250-4d9e-b6da-921018115809-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.265011 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" event={"ID":"1d2aa3cc-f250-4d9e-b6da-921018115809","Type":"ContainerDied","Data":"38917bb0076b0f305c7f5dff2f17b8fe4374173919baf6f7745edcdce10d47c8"} Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.265052 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38917bb0076b0f305c7f5dff2f17b8fe4374173919baf6f7745edcdce10d47c8" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.265062 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.355366 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg"] Oct 07 13:38:58 crc kubenswrapper[4959]: E1007 13:38:58.355797 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2aa3cc-f250-4d9e-b6da-921018115809" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.355815 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2aa3cc-f250-4d9e-b6da-921018115809" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.355996 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2aa3cc-f250-4d9e-b6da-921018115809" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.356661 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.360995 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361197 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361214 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361214 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361294 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361339 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361376 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.361472 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.365298 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg"] Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.438780 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.438859 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.438883 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.438950 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439015 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439056 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439078 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439154 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439199 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgr7\" (UniqueName: 
\"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-kube-api-access-brgr7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439220 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439273 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439295 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.439355 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.540877 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.540985 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541017 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541517 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541620 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541681 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brgr7\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-kube-api-access-brgr7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541710 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541760 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541776 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541821 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541854 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541875 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.541891 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.545311 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.545745 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.546281 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.546357 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.546561 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.546705 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.546729 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.546923 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.547385 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.547715 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.547937 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.548459 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.557285 4959 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-brgr7\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-kube-api-access-brgr7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:58 crc kubenswrapper[4959]: I1007 13:38:58.683359 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:38:59 crc kubenswrapper[4959]: I1007 13:38:59.193740 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg"] Oct 07 13:38:59 crc kubenswrapper[4959]: I1007 13:38:59.274669 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" event={"ID":"1c5e92bc-6eae-4ed1-81e8-400019fc8a13","Type":"ContainerStarted","Data":"350b17f77043657cd909a88ab29f3a9431e667f7ec51eec82645cac15f449905"} Oct 07 13:39:00 crc kubenswrapper[4959]: I1007 13:39:00.284806 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" event={"ID":"1c5e92bc-6eae-4ed1-81e8-400019fc8a13","Type":"ContainerStarted","Data":"ccebee28aa625cfd6d55625f6d5169477367e05c81b18b2de1cf80e8a6bc91de"} Oct 07 13:39:00 crc kubenswrapper[4959]: I1007 13:39:00.315610 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" podStartSLOduration=1.674009286 podStartE2EDuration="2.315590283s" podCreationTimestamp="2025-10-07 13:38:58 +0000 UTC" firstStartedPulling="2025-10-07 13:38:59.200081468 +0000 UTC m=+2291.360804145" lastFinishedPulling="2025-10-07 13:38:59.841662465 +0000 UTC m=+2292.002385142" observedRunningTime="2025-10-07 13:39:00.307588183 +0000 UTC m=+2292.468310860" 
watchObservedRunningTime="2025-10-07 13:39:00.315590283 +0000 UTC m=+2292.476312960" Oct 07 13:39:07 crc kubenswrapper[4959]: I1007 13:39:07.696169 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:39:07 crc kubenswrapper[4959]: I1007 13:39:07.696819 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:39:30 crc kubenswrapper[4959]: I1007 13:39:30.556455 4959 generic.go:334] "Generic (PLEG): container finished" podID="1c5e92bc-6eae-4ed1-81e8-400019fc8a13" containerID="ccebee28aa625cfd6d55625f6d5169477367e05c81b18b2de1cf80e8a6bc91de" exitCode=0 Oct 07 13:39:30 crc kubenswrapper[4959]: I1007 13:39:30.556560 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" event={"ID":"1c5e92bc-6eae-4ed1-81e8-400019fc8a13","Type":"ContainerDied","Data":"ccebee28aa625cfd6d55625f6d5169477367e05c81b18b2de1cf80e8a6bc91de"} Oct 07 13:39:31 crc kubenswrapper[4959]: I1007 13:39:31.987313 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.158775 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ssh-key\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.158913 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-bootstrap-combined-ca-bundle\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159004 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-nova-combined-ca-bundle\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159045 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-inventory\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159094 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 
13:39:32.159182 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brgr7\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-kube-api-access-brgr7\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159273 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-neutron-metadata-combined-ca-bundle\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159340 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ceph\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159375 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-ovn-default-certs-0\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159459 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-repo-setup-combined-ca-bundle\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159496 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-libvirt-combined-ca-bundle\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159528 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.159560 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ovn-combined-ca-bundle\") pod \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\" (UID: \"1c5e92bc-6eae-4ed1-81e8-400019fc8a13\") " Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.165834 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-kube-api-access-brgr7" (OuterVolumeSpecName: "kube-api-access-brgr7") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "kube-api-access-brgr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.165829 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.166556 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.166619 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.167862 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.167972 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.169359 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.170089 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ceph" (OuterVolumeSpecName: "ceph") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.170115 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.170443 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.170756 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.190129 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.198334 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-inventory" (OuterVolumeSpecName: "inventory") pod "1c5e92bc-6eae-4ed1-81e8-400019fc8a13" (UID: "1c5e92bc-6eae-4ed1-81e8-400019fc8a13"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262115 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262160 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262178 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brgr7\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-kube-api-access-brgr7\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262193 4959 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262206 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262238 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262252 4959 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262268 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262284 4959 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262299 4959 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262310 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262322 4959 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.262334 4959 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5e92bc-6eae-4ed1-81e8-400019fc8a13-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.574687 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" event={"ID":"1c5e92bc-6eae-4ed1-81e8-400019fc8a13","Type":"ContainerDied","Data":"350b17f77043657cd909a88ab29f3a9431e667f7ec51eec82645cac15f449905"} Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.574739 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350b17f77043657cd909a88ab29f3a9431e667f7ec51eec82645cac15f449905" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.574741 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.677243 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42"] Oct 07 13:39:32 crc kubenswrapper[4959]: E1007 13:39:32.677796 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5e92bc-6eae-4ed1-81e8-400019fc8a13" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.677825 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5e92bc-6eae-4ed1-81e8-400019fc8a13" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.678063 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5e92bc-6eae-4ed1-81e8-400019fc8a13" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.678982 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.686370 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.687137 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.687467 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.687735 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.688390 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.696768 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42"] Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.873596 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.873673 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2pq\" (UniqueName: \"kubernetes.io/projected/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-kube-api-access-hv2pq\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: 
\"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.873791 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.874056 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.975339 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.975478 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.975530 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.975560 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2pq\" (UniqueName: \"kubernetes.io/projected/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-kube-api-access-hv2pq\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.980382 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.980922 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.987983 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 
13:39:32.991699 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2pq\" (UniqueName: \"kubernetes.io/projected/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-kube-api-access-hv2pq\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:32 crc kubenswrapper[4959]: I1007 13:39:32.995921 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:33 crc kubenswrapper[4959]: I1007 13:39:33.575910 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42"] Oct 07 13:39:33 crc kubenswrapper[4959]: I1007 13:39:33.587220 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:39:34 crc kubenswrapper[4959]: I1007 13:39:34.597877 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" event={"ID":"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1","Type":"ContainerStarted","Data":"72e847af2ad6ff28ce08896971cf3c5442f5bd737a12d458df0034fe45f4400a"} Oct 07 13:39:34 crc kubenswrapper[4959]: I1007 13:39:34.598227 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" event={"ID":"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1","Type":"ContainerStarted","Data":"5eac5ca9806366d353773cc626895567c8ccf1bba159a9964bfd09728e6563c6"} Oct 07 13:39:34 crc kubenswrapper[4959]: I1007 13:39:34.622478 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" podStartSLOduration=1.960661611 podStartE2EDuration="2.622460691s" podCreationTimestamp="2025-10-07 13:39:32 +0000 UTC" firstStartedPulling="2025-10-07 
13:39:33.586896461 +0000 UTC m=+2325.747619138" lastFinishedPulling="2025-10-07 13:39:34.248695541 +0000 UTC m=+2326.409418218" observedRunningTime="2025-10-07 13:39:34.615734598 +0000 UTC m=+2326.776457295" watchObservedRunningTime="2025-10-07 13:39:34.622460691 +0000 UTC m=+2326.783183368" Oct 07 13:39:37 crc kubenswrapper[4959]: I1007 13:39:37.696255 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:39:37 crc kubenswrapper[4959]: I1007 13:39:37.696866 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:39:37 crc kubenswrapper[4959]: I1007 13:39:37.696929 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:39:37 crc kubenswrapper[4959]: I1007 13:39:37.697883 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:39:37 crc kubenswrapper[4959]: I1007 13:39:37.697950 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" 
containerID="cri-o://2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" gracePeriod=600 Oct 07 13:39:37 crc kubenswrapper[4959]: E1007 13:39:37.823474 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:39:38 crc kubenswrapper[4959]: I1007 13:39:38.635389 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" exitCode=0 Oct 07 13:39:38 crc kubenswrapper[4959]: I1007 13:39:38.635466 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"} Oct 07 13:39:38 crc kubenswrapper[4959]: I1007 13:39:38.635743 4959 scope.go:117] "RemoveContainer" containerID="f02bde6494dabf886d665f280b5d309e0e1cc29275dd57e286af213216b21353" Oct 07 13:39:38 crc kubenswrapper[4959]: I1007 13:39:38.636483 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:39:38 crc kubenswrapper[4959]: E1007 13:39:38.636787 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:39:39 crc kubenswrapper[4959]: I1007 13:39:39.648121 4959 generic.go:334] "Generic (PLEG): container finished" podID="bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" containerID="72e847af2ad6ff28ce08896971cf3c5442f5bd737a12d458df0034fe45f4400a" exitCode=0 Oct 07 13:39:39 crc kubenswrapper[4959]: I1007 13:39:39.648167 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" event={"ID":"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1","Type":"ContainerDied","Data":"72e847af2ad6ff28ce08896971cf3c5442f5bd737a12d458df0034fe45f4400a"} Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.072988 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.233221 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-inventory\") pod \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.233378 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ceph\") pod \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.233513 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ssh-key\") pod \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.233554 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-hv2pq\" (UniqueName: \"kubernetes.io/projected/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-kube-api-access-hv2pq\") pod \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\" (UID: \"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1\") " Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.248978 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ceph" (OuterVolumeSpecName: "ceph") pod "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" (UID: "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.249211 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-kube-api-access-hv2pq" (OuterVolumeSpecName: "kube-api-access-hv2pq") pod "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" (UID: "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1"). InnerVolumeSpecName "kube-api-access-hv2pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.261886 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" (UID: "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.262262 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-inventory" (OuterVolumeSpecName: "inventory") pod "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" (UID: "bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.335361 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.335389 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2pq\" (UniqueName: \"kubernetes.io/projected/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-kube-api-access-hv2pq\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.335399 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.335407 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.667998 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" event={"ID":"bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1","Type":"ContainerDied","Data":"5eac5ca9806366d353773cc626895567c8ccf1bba159a9964bfd09728e6563c6"} Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.668291 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eac5ca9806366d353773cc626895567c8ccf1bba159a9964bfd09728e6563c6" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.668103 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.865219 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp"] Oct 07 13:39:41 crc kubenswrapper[4959]: E1007 13:39:41.865605 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.865645 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.865817 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.866426 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.868902 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.872272 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.872616 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.872830 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.873004 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.873217 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.880455 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp"] Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.945877 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.945949 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.945994 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.946031 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.946072 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:41 crc kubenswrapper[4959]: I1007 13:39:41.946155 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76spv\" (UniqueName: \"kubernetes.io/projected/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-kube-api-access-76spv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" 
Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.048149 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.048245 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76spv\" (UniqueName: \"kubernetes.io/projected/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-kube-api-access-76spv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.048310 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.048332 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.048362 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: 
\"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.048394 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.049604 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.053311 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.053389 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.053586 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.055108 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.065087 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76spv\" (UniqueName: \"kubernetes.io/projected/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-kube-api-access-76spv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2tmp\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.186478 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:39:42 crc kubenswrapper[4959]: I1007 13:39:42.696501 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp"] Oct 07 13:39:42 crc kubenswrapper[4959]: W1007 13:39:42.703119 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod153879ad_6c45_43f1_a7e7_6b7e2f4e8cf7.slice/crio-78c7849325dba8a38b7c499c5c7a0fb4a3fbc7201df6af2a99034631448b9796 WatchSource:0}: Error finding container 78c7849325dba8a38b7c499c5c7a0fb4a3fbc7201df6af2a99034631448b9796: Status 404 returned error can't find the container with id 78c7849325dba8a38b7c499c5c7a0fb4a3fbc7201df6af2a99034631448b9796 Oct 07 13:39:43 crc kubenswrapper[4959]: I1007 13:39:43.689835 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" event={"ID":"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7","Type":"ContainerStarted","Data":"ef436d3ea126c11cdda14806197be14b037a118e4b687f9d6e44f7d48d76a541"} Oct 07 13:39:43 crc kubenswrapper[4959]: I1007 13:39:43.690429 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" event={"ID":"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7","Type":"ContainerStarted","Data":"78c7849325dba8a38b7c499c5c7a0fb4a3fbc7201df6af2a99034631448b9796"} Oct 07 13:39:43 crc kubenswrapper[4959]: I1007 13:39:43.712171 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" podStartSLOduration=2.062607615 podStartE2EDuration="2.712152222s" podCreationTimestamp="2025-10-07 13:39:41 +0000 UTC" firstStartedPulling="2025-10-07 13:39:42.705343548 +0000 UTC m=+2334.866066225" lastFinishedPulling="2025-10-07 13:39:43.354888155 +0000 UTC m=+2335.515610832" observedRunningTime="2025-10-07 
13:39:43.710069722 +0000 UTC m=+2335.870792419" watchObservedRunningTime="2025-10-07 13:39:43.712152222 +0000 UTC m=+2335.872874899" Oct 07 13:39:52 crc kubenswrapper[4959]: I1007 13:39:52.809491 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:39:52 crc kubenswrapper[4959]: E1007 13:39:52.810410 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:40:04 crc kubenswrapper[4959]: I1007 13:40:04.809833 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:40:04 crc kubenswrapper[4959]: E1007 13:40:04.811695 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:40:15 crc kubenswrapper[4959]: I1007 13:40:15.809121 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:40:15 crc kubenswrapper[4959]: E1007 13:40:15.809918 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:40:30 crc kubenswrapper[4959]: I1007 13:40:30.809223 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:40:30 crc kubenswrapper[4959]: E1007 13:40:30.810311 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:40:41 crc kubenswrapper[4959]: I1007 13:40:41.809746 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:40:41 crc kubenswrapper[4959]: E1007 13:40:41.811101 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:40:52 crc kubenswrapper[4959]: I1007 13:40:52.810716 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:40:52 crc kubenswrapper[4959]: E1007 13:40:52.814052 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:40:53 crc kubenswrapper[4959]: I1007 13:40:53.290782 4959 generic.go:334] "Generic (PLEG): container finished" podID="153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" containerID="ef436d3ea126c11cdda14806197be14b037a118e4b687f9d6e44f7d48d76a541" exitCode=0 Oct 07 13:40:53 crc kubenswrapper[4959]: I1007 13:40:53.290838 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" event={"ID":"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7","Type":"ContainerDied","Data":"ef436d3ea126c11cdda14806197be14b037a118e4b687f9d6e44f7d48d76a541"} Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.768822 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.901850 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ssh-key\") pod \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.901958 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ceph\") pod \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.901993 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovncontroller-config-0\") pod 
\"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.902068 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76spv\" (UniqueName: \"kubernetes.io/projected/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-kube-api-access-76spv\") pod \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.902551 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-inventory\") pod \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.902934 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovn-combined-ca-bundle\") pod \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\" (UID: \"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7\") " Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.908453 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-kube-api-access-76spv" (OuterVolumeSpecName: "kube-api-access-76spv") pod "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" (UID: "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7"). InnerVolumeSpecName "kube-api-access-76spv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.910817 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ceph" (OuterVolumeSpecName: "ceph") pod "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" (UID: "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.911107 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" (UID: "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.929274 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" (UID: "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.930594 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" (UID: "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:40:54 crc kubenswrapper[4959]: I1007 13:40:54.933478 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-inventory" (OuterVolumeSpecName: "inventory") pod "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" (UID: "153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.007778 4959 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.007811 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76spv\" (UniqueName: \"kubernetes.io/projected/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-kube-api-access-76spv\") on node \"crc\" DevicePath \"\"" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.007822 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.007833 4959 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.007842 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.007850 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.310430 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" event={"ID":"153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7","Type":"ContainerDied","Data":"78c7849325dba8a38b7c499c5c7a0fb4a3fbc7201df6af2a99034631448b9796"} Oct 07 13:40:55 crc 
kubenswrapper[4959]: I1007 13:40:55.310484 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78c7849325dba8a38b7c499c5c7a0fb4a3fbc7201df6af2a99034631448b9796" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.311073 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2tmp" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.402946 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx"] Oct 07 13:40:55 crc kubenswrapper[4959]: E1007 13:40:55.403459 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.403484 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.403767 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.404601 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.408290 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.408354 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.408703 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.408748 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.408297 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.409001 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.409064 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416469 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416571 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416673 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416705 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416759 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24cb\" (UniqueName: \"kubernetes.io/projected/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-kube-api-access-q24cb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416808 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ceph\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.416839 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.418654 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx"] Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.519353 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q24cb\" (UniqueName: \"kubernetes.io/projected/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-kube-api-access-q24cb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.519820 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.519857 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.519885 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.519976 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.520044 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.520071 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.524827 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.524844 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.526102 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.526785 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" 
Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.527899 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.529869 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.547549 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24cb\" (UniqueName: \"kubernetes.io/projected/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-kube-api-access-q24cb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:55 crc kubenswrapper[4959]: I1007 13:40:55.732268 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:40:56 crc kubenswrapper[4959]: I1007 13:40:56.292812 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx"] Oct 07 13:40:56 crc kubenswrapper[4959]: I1007 13:40:56.319535 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" event={"ID":"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c","Type":"ContainerStarted","Data":"3e69b471a2e8ac3edf65c53f2a167cb58bbbd71e11bc81544a19f073c5b2b49e"} Oct 07 13:40:58 crc kubenswrapper[4959]: I1007 13:40:58.336504 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" event={"ID":"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c","Type":"ContainerStarted","Data":"71106efd2985d6e0958f7d825cf2e4a19f272859e5ae99b2ce59a013bb5e0333"} Oct 07 13:41:06 crc kubenswrapper[4959]: I1007 13:41:06.809996 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:41:06 crc kubenswrapper[4959]: E1007 13:41:06.811730 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:41:20 crc kubenswrapper[4959]: I1007 13:41:20.809136 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:41:20 crc kubenswrapper[4959]: E1007 13:41:20.811750 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:41:35 crc kubenswrapper[4959]: I1007 13:41:35.809428 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:41:35 crc kubenswrapper[4959]: E1007 13:41:35.810377 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:41:49 crc kubenswrapper[4959]: I1007 13:41:49.809485 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3" Oct 07 13:41:49 crc kubenswrapper[4959]: E1007 13:41:49.810850 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:41:59 crc kubenswrapper[4959]: I1007 13:41:59.852468 4959 generic.go:334] "Generic (PLEG): container finished" podID="fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" containerID="71106efd2985d6e0958f7d825cf2e4a19f272859e5ae99b2ce59a013bb5e0333" exitCode=0 Oct 07 13:41:59 crc kubenswrapper[4959]: I1007 13:41:59.852574 
4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" event={"ID":"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c","Type":"ContainerDied","Data":"71106efd2985d6e0958f7d825cf2e4a19f272859e5ae99b2ce59a013bb5e0333"} Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.236698 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327030 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q24cb\" (UniqueName: \"kubernetes.io/projected/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-kube-api-access-q24cb\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327084 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ssh-key\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327135 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-nova-metadata-neutron-config-0\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327156 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ceph\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327177 4959 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-metadata-combined-ca-bundle\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327219 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-inventory\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.327261 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\" (UID: \"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c\") " Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.332909 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.333156 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-kube-api-access-q24cb" (OuterVolumeSpecName: "kube-api-access-q24cb") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "kube-api-access-q24cb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.335606 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ceph" (OuterVolumeSpecName: "ceph") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.356698 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-inventory" (OuterVolumeSpecName: "inventory") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.358574 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.360317 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.366762 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" (UID: "fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.430389 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q24cb\" (UniqueName: \"kubernetes.io/projected/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-kube-api-access-q24cb\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.430708 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.430722 4959 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.430735 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.430746 4959 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: 
I1007 13:42:01.430759 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.430770 4959 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.870096 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.870090 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx" event={"ID":"fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c","Type":"ContainerDied","Data":"3e69b471a2e8ac3edf65c53f2a167cb58bbbd71e11bc81544a19f073c5b2b49e"} Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.870237 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e69b471a2e8ac3edf65c53f2a167cb58bbbd71e11bc81544a19f073c5b2b49e" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.965708 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"] Oct 07 13:42:01 crc kubenswrapper[4959]: E1007 13:42:01.966129 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.966149 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 13:42:01 crc 
kubenswrapper[4959]: I1007 13:42:01.966336 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.966958 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.969376 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.976134 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.976235 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szkdk"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.976302 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.976388 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.976493 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 07 13:42:01 crc kubenswrapper[4959]: I1007 13:42:01.981108 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"]
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.142917 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.142965 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.143010 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.143031 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7mx\" (UniqueName: \"kubernetes.io/projected/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-kube-api-access-np7mx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.143075 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.143170 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.244705 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.245481 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.245519 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.245574 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.245605 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7mx\" (UniqueName: \"kubernetes.io/projected/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-kube-api-access-np7mx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.245674 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.252438 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.252573 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.252901 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.253126 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.253576 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.263216 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7mx\" (UniqueName: \"kubernetes.io/projected/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-kube-api-access-np7mx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.290182 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.808967 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:42:02 crc kubenswrapper[4959]: E1007 13:42:02.809491 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:42:02 crc kubenswrapper[4959]: I1007 13:42:02.899442 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb"]
Oct 07 13:42:03 crc kubenswrapper[4959]: I1007 13:42:03.889795 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" event={"ID":"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3","Type":"ContainerStarted","Data":"184ecd511c5f4acb31ba420831155f600d59e669a78964eb234668349fdd1440"}
Oct 07 13:42:04 crc kubenswrapper[4959]: I1007 13:42:04.898762 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" event={"ID":"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3","Type":"ContainerStarted","Data":"22c624abea3fdccf5b1f95d64cdea74cf201ea060d4bf7c6ceae0c85d82529a7"}
Oct 07 13:42:04 crc kubenswrapper[4959]: I1007 13:42:04.922127 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" podStartSLOduration=2.871905468 podStartE2EDuration="3.922108758s" podCreationTimestamp="2025-10-07 13:42:01 +0000 UTC" firstStartedPulling="2025-10-07 13:42:02.911831866 +0000 UTC m=+2475.072554543" lastFinishedPulling="2025-10-07 13:42:03.962035156 +0000 UTC m=+2476.122757833" observedRunningTime="2025-10-07 13:42:04.920296196 +0000 UTC m=+2477.081018883" watchObservedRunningTime="2025-10-07 13:42:04.922108758 +0000 UTC m=+2477.082831435"
Oct 07 13:42:14 crc kubenswrapper[4959]: I1007 13:42:14.809209 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:42:14 crc kubenswrapper[4959]: E1007 13:42:14.810038 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:42:29 crc kubenswrapper[4959]: I1007 13:42:29.809040 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:42:29 crc kubenswrapper[4959]: E1007 13:42:29.810856 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:42:43 crc kubenswrapper[4959]: I1007 13:42:43.809353 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:42:43 crc kubenswrapper[4959]: E1007 13:42:43.810114 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:42:55 crc kubenswrapper[4959]: I1007 13:42:55.809078 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:42:55 crc kubenswrapper[4959]: E1007 13:42:55.809851 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:43:09 crc kubenswrapper[4959]: I1007 13:43:09.808468 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:43:09 crc kubenswrapper[4959]: E1007 13:43:09.809349 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:43:22 crc kubenswrapper[4959]: I1007 13:43:22.809751 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:43:22 crc kubenswrapper[4959]: E1007 13:43:22.810736 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:43:36 crc kubenswrapper[4959]: I1007 13:43:36.809798 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:43:36 crc kubenswrapper[4959]: E1007 13:43:36.810581 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:43:47 crc kubenswrapper[4959]: I1007 13:43:47.808890 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:43:47 crc kubenswrapper[4959]: E1007 13:43:47.809749 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:43:59 crc kubenswrapper[4959]: I1007 13:43:59.813253 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:43:59 crc kubenswrapper[4959]: E1007 13:43:59.814100 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:44:14 crc kubenswrapper[4959]: I1007 13:44:14.809352 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:44:14 crc kubenswrapper[4959]: E1007 13:44:14.809980 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:44:27 crc kubenswrapper[4959]: I1007 13:44:27.809372 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:44:27 crc kubenswrapper[4959]: E1007 13:44:27.810194 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:44:38 crc kubenswrapper[4959]: I1007 13:44:38.815356 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:44:39 crc kubenswrapper[4959]: I1007 13:44:39.192608 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"522065e10442a4399769eb986e144cd13cc82269fb86133b36dff3c473205e1d"}
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.161467 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"]
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.163461 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.167181 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.167586 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.171550 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"]
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.293818 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba6a560c-ff3d-432a-8db0-c51d43ce4082-secret-volume\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.293876 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lnc\" (UniqueName: \"kubernetes.io/projected/ba6a560c-ff3d-432a-8db0-c51d43ce4082-kube-api-access-26lnc\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.293902 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba6a560c-ff3d-432a-8db0-c51d43ce4082-config-volume\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.396153 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lnc\" (UniqueName: \"kubernetes.io/projected/ba6a560c-ff3d-432a-8db0-c51d43ce4082-kube-api-access-26lnc\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.396203 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba6a560c-ff3d-432a-8db0-c51d43ce4082-config-volume\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.396353 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba6a560c-ff3d-432a-8db0-c51d43ce4082-secret-volume\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.397210 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba6a560c-ff3d-432a-8db0-c51d43ce4082-config-volume\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.404109 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba6a560c-ff3d-432a-8db0-c51d43ce4082-secret-volume\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.413883 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lnc\" (UniqueName: \"kubernetes.io/projected/ba6a560c-ff3d-432a-8db0-c51d43ce4082-kube-api-access-26lnc\") pod \"collect-profiles-29330745-pdqc9\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.494701 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:00 crc kubenswrapper[4959]: I1007 13:45:00.910906 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"]
Oct 07 13:45:01 crc kubenswrapper[4959]: I1007 13:45:01.381716 4959 generic.go:334] "Generic (PLEG): container finished" podID="ba6a560c-ff3d-432a-8db0-c51d43ce4082" containerID="2b1dac7f04b7d257a085580a3e39894532d2f35eaa2de437d9a8685fb1a766ab" exitCode=0
Oct 07 13:45:01 crc kubenswrapper[4959]: I1007 13:45:01.381782 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9" event={"ID":"ba6a560c-ff3d-432a-8db0-c51d43ce4082","Type":"ContainerDied","Data":"2b1dac7f04b7d257a085580a3e39894532d2f35eaa2de437d9a8685fb1a766ab"}
Oct 07 13:45:01 crc kubenswrapper[4959]: I1007 13:45:01.382012 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9" event={"ID":"ba6a560c-ff3d-432a-8db0-c51d43ce4082","Type":"ContainerStarted","Data":"d6e728b6e73b806ad1994459d366e834833140bd7b6c8871ef40b75ad3eb1aa0"}
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.724299 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.840551 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba6a560c-ff3d-432a-8db0-c51d43ce4082-config-volume\") pod \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") "
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.840673 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba6a560c-ff3d-432a-8db0-c51d43ce4082-secret-volume\") pod \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") "
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.840730 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lnc\" (UniqueName: \"kubernetes.io/projected/ba6a560c-ff3d-432a-8db0-c51d43ce4082-kube-api-access-26lnc\") pod \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\" (UID: \"ba6a560c-ff3d-432a-8db0-c51d43ce4082\") "
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.841559 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6a560c-ff3d-432a-8db0-c51d43ce4082-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba6a560c-ff3d-432a-8db0-c51d43ce4082" (UID: "ba6a560c-ff3d-432a-8db0-c51d43ce4082"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.847357 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6a560c-ff3d-432a-8db0-c51d43ce4082-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba6a560c-ff3d-432a-8db0-c51d43ce4082" (UID: "ba6a560c-ff3d-432a-8db0-c51d43ce4082"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.847390 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6a560c-ff3d-432a-8db0-c51d43ce4082-kube-api-access-26lnc" (OuterVolumeSpecName: "kube-api-access-26lnc") pod "ba6a560c-ff3d-432a-8db0-c51d43ce4082" (UID: "ba6a560c-ff3d-432a-8db0-c51d43ce4082"). InnerVolumeSpecName "kube-api-access-26lnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.943591 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba6a560c-ff3d-432a-8db0-c51d43ce4082-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.943621 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba6a560c-ff3d-432a-8db0-c51d43ce4082-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 13:45:02 crc kubenswrapper[4959]: I1007 13:45:02.943648 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lnc\" (UniqueName: \"kubernetes.io/projected/ba6a560c-ff3d-432a-8db0-c51d43ce4082-kube-api-access-26lnc\") on node \"crc\" DevicePath \"\""
Oct 07 13:45:03 crc kubenswrapper[4959]: I1007 13:45:03.408441 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9" event={"ID":"ba6a560c-ff3d-432a-8db0-c51d43ce4082","Type":"ContainerDied","Data":"d6e728b6e73b806ad1994459d366e834833140bd7b6c8871ef40b75ad3eb1aa0"}
Oct 07 13:45:03 crc kubenswrapper[4959]: I1007 13:45:03.408529 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"
Oct 07 13:45:03 crc kubenswrapper[4959]: I1007 13:45:03.408590 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e728b6e73b806ad1994459d366e834833140bd7b6c8871ef40b75ad3eb1aa0"
Oct 07 13:45:03 crc kubenswrapper[4959]: I1007 13:45:03.798033 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"]
Oct 07 13:45:03 crc kubenswrapper[4959]: I1007 13:45:03.805403 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-r76vr"]
Oct 07 13:45:04 crc kubenswrapper[4959]: I1007 13:45:04.821651 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b487cdd-8a08-4621-9259-567d66d5cc06" path="/var/lib/kubelet/pods/5b487cdd-8a08-4621-9259-567d66d5cc06/volumes"
Oct 07 13:45:54 crc kubenswrapper[4959]: I1007 13:45:54.291504 4959 scope.go:117] "RemoveContainer" containerID="8457f7e55a2f6fd8e487dda8cc7d95501d75f8ae2d3da336170d1cd0fc638bb7"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.637833 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5r5rk"]
Oct 07 13:45:59 crc kubenswrapper[4959]: E1007 13:45:59.639865 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a560c-ff3d-432a-8db0-c51d43ce4082" containerName="collect-profiles"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.639896 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a560c-ff3d-432a-8db0-c51d43ce4082" containerName="collect-profiles"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.640225 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6a560c-ff3d-432a-8db0-c51d43ce4082" containerName="collect-profiles"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.642007 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.653012 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5r5rk"]
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.772869 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-catalog-content\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.773247 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-utilities\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.773388 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2wf\" (UniqueName: \"kubernetes.io/projected/fb9c9506-a7c8-4a51-9e6c-198d0020693a-kube-api-access-8r2wf\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.875815 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-utilities\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.875971 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2wf\" (UniqueName: \"kubernetes.io/projected/fb9c9506-a7c8-4a51-9e6c-198d0020693a-kube-api-access-8r2wf\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.876030 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-catalog-content\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.876307 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-utilities\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.876470 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-catalog-content\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.905671 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2wf\" (UniqueName: \"kubernetes.io/projected/fb9c9506-a7c8-4a51-9e6c-198d0020693a-kube-api-access-8r2wf\") pod \"certified-operators-5r5rk\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") " pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:45:59 crc kubenswrapper[4959]: I1007 13:45:59.979476 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:46:00 crc kubenswrapper[4959]: I1007 13:46:00.489975 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5r5rk"]
Oct 07 13:46:00 crc kubenswrapper[4959]: I1007 13:46:00.885649 4959 generic.go:334] "Generic (PLEG): container finished" podID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerID="4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01" exitCode=0
Oct 07 13:46:00 crc kubenswrapper[4959]: I1007 13:46:00.885871 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerDied","Data":"4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01"}
Oct 07 13:46:00 crc kubenswrapper[4959]: I1007 13:46:00.886110 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerStarted","Data":"89d1c719a959ac2a50d7777cd448f8756de32cd2005ce954f14a7012713d7026"}
Oct 07 13:46:00 crc kubenswrapper[4959]: I1007 13:46:00.887798 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 13:46:01 crc kubenswrapper[4959]: I1007 13:46:01.897233 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerStarted","Data":"cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128"}
Oct 07 13:46:02 crc kubenswrapper[4959]: I1007 13:46:02.906215 4959 generic.go:334] "Generic (PLEG): container finished" podID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerID="cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128" exitCode=0
Oct 07 13:46:02 crc kubenswrapper[4959]: I1007 13:46:02.906256 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerDied","Data":"cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128"}
Oct 07 13:46:03 crc kubenswrapper[4959]: I1007 13:46:03.916368 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerStarted","Data":"24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe"}
Oct 07 13:46:03 crc kubenswrapper[4959]: I1007 13:46:03.937354 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5r5rk" podStartSLOduration=2.522476754 podStartE2EDuration="4.937331998s" podCreationTimestamp="2025-10-07 13:45:59 +0000 UTC" firstStartedPulling="2025-10-07 13:46:00.887578047 +0000 UTC m=+2713.048300714" lastFinishedPulling="2025-10-07 13:46:03.302433281 +0000 UTC m=+2715.463155958" observedRunningTime="2025-10-07 13:46:03.932960114 +0000 UTC m=+2716.093682811" watchObservedRunningTime="2025-10-07 13:46:03.937331998 +0000 UTC m=+2716.098054675"
Oct 07 13:46:09 crc kubenswrapper[4959]: I1007 13:46:09.979677 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:46:09 crc kubenswrapper[4959]: I1007 13:46:09.980432 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:46:10 crc kubenswrapper[4959]: I1007 13:46:10.050023 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:46:11 crc kubenswrapper[4959]: I1007 13:46:11.046913 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:46:11 crc kubenswrapper[4959]: I1007 13:46:11.105810 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5r5rk"]
Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.032131 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5r5rk" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="registry-server" containerID="cri-o://24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe" gracePeriod=2
Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.490712 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r5rk"
Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.669583 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-catalog-content\") pod \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") "
Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.669744 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r2wf\" (UniqueName: \"kubernetes.io/projected/fb9c9506-a7c8-4a51-9e6c-198d0020693a-kube-api-access-8r2wf\") pod \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") "
Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.669820 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-utilities\") pod \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\" (UID: \"fb9c9506-a7c8-4a51-9e6c-198d0020693a\") "
Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.671062 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-utilities" (OuterVolumeSpecName: "utilities") pod "fb9c9506-a7c8-4a51-9e6c-198d0020693a" (UID: "fb9c9506-a7c8-4a51-9e6c-198d0020693a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.679815 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9c9506-a7c8-4a51-9e6c-198d0020693a-kube-api-access-8r2wf" (OuterVolumeSpecName: "kube-api-access-8r2wf") pod "fb9c9506-a7c8-4a51-9e6c-198d0020693a" (UID: "fb9c9506-a7c8-4a51-9e6c-198d0020693a"). InnerVolumeSpecName "kube-api-access-8r2wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.722657 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb9c9506-a7c8-4a51-9e6c-198d0020693a" (UID: "fb9c9506-a7c8-4a51-9e6c-198d0020693a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.772795 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r2wf\" (UniqueName: \"kubernetes.io/projected/fb9c9506-a7c8-4a51-9e6c-198d0020693a-kube-api-access-8r2wf\") on node \"crc\" DevicePath \"\"" Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.772857 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:46:13 crc kubenswrapper[4959]: I1007 13:46:13.772867 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9c9506-a7c8-4a51-9e6c-198d0020693a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.043453 4959 generic.go:334] "Generic (PLEG): container finished" podID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerID="24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe" exitCode=0 Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.043503 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerDied","Data":"24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe"} Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.043563 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r5rk" event={"ID":"fb9c9506-a7c8-4a51-9e6c-198d0020693a","Type":"ContainerDied","Data":"89d1c719a959ac2a50d7777cd448f8756de32cd2005ce954f14a7012713d7026"} Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.043587 4959 scope.go:117] "RemoveContainer" containerID="24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 
13:46:14.043602 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r5rk" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.063646 4959 scope.go:117] "RemoveContainer" containerID="cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.094603 4959 scope.go:117] "RemoveContainer" containerID="4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.151055 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5r5rk"] Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.154766 4959 scope.go:117] "RemoveContainer" containerID="24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe" Oct 07 13:46:14 crc kubenswrapper[4959]: E1007 13:46:14.155750 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe\": container with ID starting with 24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe not found: ID does not exist" containerID="24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.155793 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe"} err="failed to get container status \"24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe\": rpc error: code = NotFound desc = could not find container \"24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe\": container with ID starting with 24e6df380f54b742f8e3ee4e586dfbefdfeee3e7db26cebece737093d4521bbe not found: ID does not exist" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.155819 4959 
scope.go:117] "RemoveContainer" containerID="cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128" Oct 07 13:46:14 crc kubenswrapper[4959]: E1007 13:46:14.156531 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128\": container with ID starting with cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128 not found: ID does not exist" containerID="cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.156561 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128"} err="failed to get container status \"cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128\": rpc error: code = NotFound desc = could not find container \"cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128\": container with ID starting with cf06c4d4884b4328b9fe69f6d1a2bb675f93b87f535a5d0debd923aae4643128 not found: ID does not exist" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.156577 4959 scope.go:117] "RemoveContainer" containerID="4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01" Oct 07 13:46:14 crc kubenswrapper[4959]: E1007 13:46:14.156898 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01\": container with ID starting with 4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01 not found: ID does not exist" containerID="4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.156918 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01"} err="failed to get container status \"4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01\": rpc error: code = NotFound desc = could not find container \"4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01\": container with ID starting with 4a266915e8113aa46025ea488f8007453a11bdf13d54a79c61568e327956ed01 not found: ID does not exist" Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.159558 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5r5rk"] Oct 07 13:46:14 crc kubenswrapper[4959]: I1007 13:46:14.823440 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" path="/var/lib/kubelet/pods/fb9c9506-a7c8-4a51-9e6c-198d0020693a/volumes" Oct 07 13:47:07 crc kubenswrapper[4959]: I1007 13:47:07.695902 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:47:07 crc kubenswrapper[4959]: I1007 13:47:07.697714 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:47:22 crc kubenswrapper[4959]: I1007 13:47:22.633878 4959 generic.go:334] "Generic (PLEG): container finished" podID="dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" containerID="22c624abea3fdccf5b1f95d64cdea74cf201ea060d4bf7c6ceae0c85d82529a7" exitCode=0 Oct 07 13:47:22 crc kubenswrapper[4959]: I1007 13:47:22.633942 4959 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" event={"ID":"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3","Type":"ContainerDied","Data":"22c624abea3fdccf5b1f95d64cdea74cf201ea060d4bf7c6ceae0c85d82529a7"} Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.055106 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.161293 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.161680 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7mx\" (UniqueName: \"kubernetes.io/projected/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-kube-api-access-np7mx\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.161844 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-combined-ca-bundle\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.161897 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-secret-0\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.161952 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ceph\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.162041 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ssh-key\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.168005 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.168156 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-kube-api-access-np7mx" (OuterVolumeSpecName: "kube-api-access-np7mx") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3"). InnerVolumeSpecName "kube-api-access-np7mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.168263 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ceph" (OuterVolumeSpecName: "ceph") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:47:24 crc kubenswrapper[4959]: E1007 13:47:24.188259 4959 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory podName:dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3 nodeName:}" failed. No retries permitted until 2025-10-07 13:47:24.688228596 +0000 UTC m=+2796.848951273 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3") : error deleting /var/lib/kubelet/pods/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3/volume-subpaths: remove /var/lib/kubelet/pods/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3/volume-subpaths: no such file or directory Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.190871 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.191279 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.263882 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7mx\" (UniqueName: \"kubernetes.io/projected/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-kube-api-access-np7mx\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.263919 4959 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.263929 4959 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.263938 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.263947 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.655177 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" event={"ID":"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3","Type":"ContainerDied","Data":"184ecd511c5f4acb31ba420831155f600d59e669a78964eb234668349fdd1440"} Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.655252 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="184ecd511c5f4acb31ba420831155f600d59e669a78964eb234668349fdd1440" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 
13:47:24.655287 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745349 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"] Oct 07 13:47:24 crc kubenswrapper[4959]: E1007 13:47:24.745707 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="extract-utilities" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745722 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="extract-utilities" Oct 07 13:47:24 crc kubenswrapper[4959]: E1007 13:47:24.745749 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="registry-server" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745756 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="registry-server" Oct 07 13:47:24 crc kubenswrapper[4959]: E1007 13:47:24.745764 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745771 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 13:47:24 crc kubenswrapper[4959]: E1007 13:47:24.745787 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="extract-content" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745793 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" 
containerName="extract-content" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745948 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.745963 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9c9506-a7c8-4a51-9e6c-198d0020693a" containerName="registry-server" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.746618 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.748685 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.748747 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.749717 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.749870 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.758661 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"] Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.771305 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory\") pod \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\" (UID: \"dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3\") " Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.775899 4959 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory" (OuterVolumeSpecName: "inventory") pod "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3" (UID: "dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873037 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873095 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873146 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873242 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873323 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873373 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jdl\" (UniqueName: \"kubernetes.io/projected/0ebc66fe-ebad-47d5-93df-fbff665959d9-kube-api-access-j2jdl\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873442 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873475 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873524 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873559 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873588 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.873701 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3-inventory\") on node \"crc\" DevicePath \"\""
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975291 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975727 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jdl\" (UniqueName: \"kubernetes.io/projected/0ebc66fe-ebad-47d5-93df-fbff665959d9-kube-api-access-j2jdl\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975773 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975803 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975849 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975872 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975891 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975938 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975960 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.975991 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.976067 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.976928 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.977189 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.979678 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.979948 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.981618 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.981789 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.982243 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.983430 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.985272 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.991312 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:24 crc kubenswrapper[4959]: I1007 13:47:24.993672 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jdl\" (UniqueName: \"kubernetes.io/projected/0ebc66fe-ebad-47d5-93df-fbff665959d9-kube-api-access-j2jdl\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:25 crc kubenswrapper[4959]: I1007 13:47:25.075786 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:47:25 crc kubenswrapper[4959]: I1007 13:47:25.580605 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"]
Oct 07 13:47:25 crc kubenswrapper[4959]: I1007 13:47:25.663079 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" event={"ID":"0ebc66fe-ebad-47d5-93df-fbff665959d9","Type":"ContainerStarted","Data":"8447452b3bf5118a2f7c3b4fd1efabe914c1f98eed5955e01451f8c29f7a56d2"}
Oct 07 13:47:27 crc kubenswrapper[4959]: I1007 13:47:27.679972 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" event={"ID":"0ebc66fe-ebad-47d5-93df-fbff665959d9","Type":"ContainerStarted","Data":"47bea9693b13a46811fd0665aaade9a031b2ae5eed282e6b451bc3b63d292728"}
Oct 07 13:47:27 crc kubenswrapper[4959]: I1007 13:47:27.701241 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" podStartSLOduration=2.84195889 podStartE2EDuration="3.701224746s" podCreationTimestamp="2025-10-07 13:47:24 +0000 UTC" firstStartedPulling="2025-10-07 13:47:25.581223815 +0000 UTC m=+2797.741946502" lastFinishedPulling="2025-10-07 13:47:26.440489681 +0000 UTC m=+2798.601212358" observedRunningTime="2025-10-07 13:47:27.696027079 +0000 UTC m=+2799.856749756" watchObservedRunningTime="2025-10-07 13:47:27.701224746 +0000 UTC m=+2799.861947423"
Oct 07 13:47:37 crc kubenswrapper[4959]: I1007 13:47:37.695711 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:47:37 crc kubenswrapper[4959]: I1007 13:47:37.696398 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.696166 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.697224 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.697325 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.698730 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"522065e10442a4399769eb986e144cd13cc82269fb86133b36dff3c473205e1d"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.698827 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://522065e10442a4399769eb986e144cd13cc82269fb86133b36dff3c473205e1d" gracePeriod=600
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.998816 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="522065e10442a4399769eb986e144cd13cc82269fb86133b36dff3c473205e1d" exitCode=0
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.999049 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"522065e10442a4399769eb986e144cd13cc82269fb86133b36dff3c473205e1d"}
Oct 07 13:48:07 crc kubenswrapper[4959]: I1007 13:48:07.999243 4959 scope.go:117] "RemoveContainer" containerID="2a3730c316618950bba78aa39c99029b7e77f7d65cdcbb44d565b687c49658a3"
Oct 07 13:48:09 crc kubenswrapper[4959]: I1007 13:48:09.011513 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"}
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.784448 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-28kfz"]
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.789278 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.802641 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28kfz"]
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.843405 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-utilities\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.843461 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-catalog-content\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.843542 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh8vn\" (UniqueName: \"kubernetes.io/projected/9e3a987e-2fca-46a9-a760-4c25e44d23ba-kube-api-access-dh8vn\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.944830 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh8vn\" (UniqueName: \"kubernetes.io/projected/9e3a987e-2fca-46a9-a760-4c25e44d23ba-kube-api-access-dh8vn\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.945001 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-utilities\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.945028 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-catalog-content\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.945472 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-catalog-content\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.945597 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-utilities\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:13 crc kubenswrapper[4959]: I1007 13:48:13.963749 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh8vn\" (UniqueName: \"kubernetes.io/projected/9e3a987e-2fca-46a9-a760-4c25e44d23ba-kube-api-access-dh8vn\") pod \"redhat-operators-28kfz\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") " pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:14 crc kubenswrapper[4959]: I1007 13:48:14.110508 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:14 crc kubenswrapper[4959]: I1007 13:48:14.552948 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28kfz"]
Oct 07 13:48:15 crc kubenswrapper[4959]: I1007 13:48:15.059257 4959 generic.go:334] "Generic (PLEG): container finished" podID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerID="1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d" exitCode=0
Oct 07 13:48:15 crc kubenswrapper[4959]: I1007 13:48:15.059304 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerDied","Data":"1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d"}
Oct 07 13:48:15 crc kubenswrapper[4959]: I1007 13:48:15.059331 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerStarted","Data":"ed869f68bd5d0a971d48a3e9f300d3d1f8d2796ffc078d46d1c9c4baf4f116e6"}
Oct 07 13:48:16 crc kubenswrapper[4959]: I1007 13:48:16.069415 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerStarted","Data":"c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b"}
Oct 07 13:48:17 crc kubenswrapper[4959]: I1007 13:48:17.090272 4959 generic.go:334] "Generic (PLEG): container finished" podID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerID="c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b" exitCode=0
Oct 07 13:48:17 crc kubenswrapper[4959]: I1007 13:48:17.090347 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerDied","Data":"c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b"}
Oct 07 13:48:18 crc kubenswrapper[4959]: I1007 13:48:18.101223 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerStarted","Data":"8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb"}
Oct 07 13:48:18 crc kubenswrapper[4959]: I1007 13:48:18.120447 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-28kfz" podStartSLOduration=2.418861532 podStartE2EDuration="5.120433449s" podCreationTimestamp="2025-10-07 13:48:13 +0000 UTC" firstStartedPulling="2025-10-07 13:48:15.060867571 +0000 UTC m=+2847.221590248" lastFinishedPulling="2025-10-07 13:48:17.762439498 +0000 UTC m=+2849.923162165" observedRunningTime="2025-10-07 13:48:18.117569928 +0000 UTC m=+2850.278292625" watchObservedRunningTime="2025-10-07 13:48:18.120433449 +0000 UTC m=+2850.281156126"
Oct 07 13:48:24 crc kubenswrapper[4959]: I1007 13:48:24.111588 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:24 crc kubenswrapper[4959]: I1007 13:48:24.112203 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:24 crc kubenswrapper[4959]: I1007 13:48:24.161845 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:24 crc kubenswrapper[4959]: I1007 13:48:24.206436 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:24 crc kubenswrapper[4959]: I1007 13:48:24.396599 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28kfz"]
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.164252 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-28kfz" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="registry-server" containerID="cri-o://8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb" gracePeriod=2
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.585215 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.671910 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-utilities\") pod \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") "
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.672045 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh8vn\" (UniqueName: \"kubernetes.io/projected/9e3a987e-2fca-46a9-a760-4c25e44d23ba-kube-api-access-dh8vn\") pod \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") "
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.672086 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-catalog-content\") pod \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\" (UID: \"9e3a987e-2fca-46a9-a760-4c25e44d23ba\") "
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.673114 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-utilities" (OuterVolumeSpecName: "utilities") pod "9e3a987e-2fca-46a9-a760-4c25e44d23ba" (UID: "9e3a987e-2fca-46a9-a760-4c25e44d23ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.676903 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3a987e-2fca-46a9-a760-4c25e44d23ba-kube-api-access-dh8vn" (OuterVolumeSpecName: "kube-api-access-dh8vn") pod "9e3a987e-2fca-46a9-a760-4c25e44d23ba" (UID: "9e3a987e-2fca-46a9-a760-4c25e44d23ba"). InnerVolumeSpecName "kube-api-access-dh8vn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.774302 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 13:48:26 crc kubenswrapper[4959]: I1007 13:48:26.774661 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh8vn\" (UniqueName: \"kubernetes.io/projected/9e3a987e-2fca-46a9-a760-4c25e44d23ba-kube-api-access-dh8vn\") on node \"crc\" DevicePath \"\""
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.185110 4959 generic.go:334] "Generic (PLEG): container finished" podID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerID="8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb" exitCode=0
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.185156 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerDied","Data":"8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb"}
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.185185 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28kfz" event={"ID":"9e3a987e-2fca-46a9-a760-4c25e44d23ba","Type":"ContainerDied","Data":"ed869f68bd5d0a971d48a3e9f300d3d1f8d2796ffc078d46d1c9c4baf4f116e6"}
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.185204 4959 scope.go:117] "RemoveContainer" containerID="8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.185341 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28kfz"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.207047 4959 scope.go:117] "RemoveContainer" containerID="c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.227486 4959 scope.go:117] "RemoveContainer" containerID="1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.270554 4959 scope.go:117] "RemoveContainer" containerID="8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb"
Oct 07 13:48:27 crc kubenswrapper[4959]: E1007 13:48:27.271002 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb\": container with ID starting with 8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb not found: ID does not exist" containerID="8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.271051 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb"} err="failed to get container status \"8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb\": rpc error: code = NotFound desc = could not find container \"8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb\": container with ID starting with 8069068dd0407bfbf33fe231d367caf6fe29f986296d1d30d3699753aced5deb not found: ID does not exist"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.271078 4959 scope.go:117] "RemoveContainer" containerID="c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b"
Oct 07 13:48:27 crc kubenswrapper[4959]: E1007 13:48:27.271529 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b\": container with ID starting with c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b not found: ID does not exist" containerID="c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.271557 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b"} err="failed to get container status \"c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b\": rpc error: code = NotFound desc = could not find container \"c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b\": container with ID starting with c97452f6004cfca69f0668b81e67475777618331ae30b12174a59732dfdea06b not found: ID does not exist"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.271575 4959 scope.go:117] "RemoveContainer" containerID="1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d"
Oct 07 13:48:27 crc kubenswrapper[4959]: E1007 13:48:27.271994 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d\": container with ID starting with 1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d not found: ID does not exist" containerID="1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.272045 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d"} err="failed to get container status \"1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d\": rpc error: code = NotFound desc = could not find container \"1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d\": container with ID starting with 1e5a2e2c9d1ee68e3b3395d3bb7612b73b1a30baa30a5ed853f561c016af266d not found: ID does not exist"
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.398102 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e3a987e-2fca-46a9-a760-4c25e44d23ba" (UID: "9e3a987e-2fca-46a9-a760-4c25e44d23ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.487052 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3a987e-2fca-46a9-a760-4c25e44d23ba-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.527910 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28kfz"]
Oct 07 13:48:27 crc kubenswrapper[4959]: I1007 13:48:27.534469 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-28kfz"]
Oct 07 13:48:28 crc kubenswrapper[4959]: I1007 13:48:28.823087 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" path="/var/lib/kubelet/pods/9e3a987e-2fca-46a9-a760-4c25e44d23ba/volumes"
Oct 07 13:50:37 crc kubenswrapper[4959]: I1007 13:50:37.695914 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:50:37 crc kubenswrapper[4959]: I1007 13:50:37.696464 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:51:07 crc kubenswrapper[4959]: I1007 13:51:07.696604 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:51:07 crc kubenswrapper[4959]: I1007 13:51:07.697651 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:51:09 crc kubenswrapper[4959]: I1007 13:51:09.610890 4959 generic.go:334] "Generic (PLEG): container finished" podID="0ebc66fe-ebad-47d5-93df-fbff665959d9" containerID="47bea9693b13a46811fd0665aaade9a031b2ae5eed282e6b451bc3b63d292728" exitCode=0
Oct 07 13:51:09 crc kubenswrapper[4959]: I1007 13:51:09.610959 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" event={"ID":"0ebc66fe-ebad-47d5-93df-fbff665959d9","Type":"ContainerDied","Data":"47bea9693b13a46811fd0665aaade9a031b2ae5eed282e6b451bc3b63d292728"}
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.010555 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw"
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143042 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-0\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") "
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143172 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") "
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143239 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-1\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") "
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143313 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-custom-ceph-combined-ca-bundle\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") "
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143380 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph-nova-0\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") "
Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143433 4959 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-0\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143557 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-extra-config-0\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143689 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2jdl\" (UniqueName: \"kubernetes.io/projected/0ebc66fe-ebad-47d5-93df-fbff665959d9-kube-api-access-j2jdl\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143776 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-1\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143858 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ssh-key\") pod \"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.143959 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-inventory\") pod 
\"0ebc66fe-ebad-47d5-93df-fbff665959d9\" (UID: \"0ebc66fe-ebad-47d5-93df-fbff665959d9\") " Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.150809 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebc66fe-ebad-47d5-93df-fbff665959d9-kube-api-access-j2jdl" (OuterVolumeSpecName: "kube-api-access-j2jdl") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "kube-api-access-j2jdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.151906 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.153095 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph" (OuterVolumeSpecName: "ceph") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.172956 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.174404 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.175836 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-inventory" (OuterVolumeSpecName: "inventory") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.178556 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.178890 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.180945 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.181066 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.185149 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "0ebc66fe-ebad-47d5-93df-fbff665959d9" (UID: "0ebc66fe-ebad-47d5-93df-fbff665959d9"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.247788 4959 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248030 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248159 4959 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248298 4959 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248397 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248485 4959 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248577 4959 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 
crc kubenswrapper[4959]: I1007 13:51:11.248686 4959 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248751 4959 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248816 4959 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0ebc66fe-ebad-47d5-93df-fbff665959d9-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.248881 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2jdl\" (UniqueName: \"kubernetes.io/projected/0ebc66fe-ebad-47d5-93df-fbff665959d9-kube-api-access-j2jdl\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.628598 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" event={"ID":"0ebc66fe-ebad-47d5-93df-fbff665959d9","Type":"ContainerDied","Data":"8447452b3bf5118a2f7c3b4fd1efabe914c1f98eed5955e01451f8c29f7a56d2"} Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.628673 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8447452b3bf5118a2f7c3b4fd1efabe914c1f98eed5955e01451f8c29f7a56d2" Oct 07 13:51:11 crc kubenswrapper[4959]: I1007 13:51:11.628681 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.898755 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 13:51:25 crc kubenswrapper[4959]: E1007 13:51:25.899920 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="extract-content" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.899939 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="extract-content" Oct 07 13:51:25 crc kubenswrapper[4959]: E1007 13:51:25.900005 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="registry-server" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.900017 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="registry-server" Oct 07 13:51:25 crc kubenswrapper[4959]: E1007 13:51:25.900039 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="extract-utilities" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.900048 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="extract-utilities" Oct 07 13:51:25 crc kubenswrapper[4959]: E1007 13:51:25.900066 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebc66fe-ebad-47d5-93df-fbff665959d9" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.900076 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebc66fe-ebad-47d5-93df-fbff665959d9" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.900277 4959 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebc66fe-ebad-47d5-93df-fbff665959d9" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.900332 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3a987e-2fca-46a9-a760-4c25e44d23ba" containerName="registry-server" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.901956 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.903684 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.903804 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.922850 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.968071 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.973582 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.975580 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 07 13:51:25 crc kubenswrapper[4959]: I1007 13:51:25.998236 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027048 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027128 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-dev\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027160 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-run\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027198 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-sys\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027229 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027253 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027272 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef8431b3-9196-4986-aba7-43ffefa14817-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027510 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027563 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027667 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027761 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027841 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.027883 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.028049 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.028152 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flm9h\" (UniqueName: \"kubernetes.io/projected/ef8431b3-9196-4986-aba7-43ffefa14817-kube-api-access-flm9h\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.028176 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.129989 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130281 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-config-data\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130384 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130485 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130563 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130666 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130602 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130852 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-scripts\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.130939 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs72z\" (UniqueName: 
\"kubernetes.io/projected/3dcdce3c-0b57-4c61-84d9-61c99ba03314-kube-api-access-bs72z\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131035 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131105 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131186 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131291 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flm9h\" (UniqueName: \"kubernetes.io/projected/ef8431b3-9196-4986-aba7-43ffefa14817-kube-api-access-flm9h\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131374 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-run\") pod 
\"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131303 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131451 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131580 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131578 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3dcdce3c-0b57-4c61-84d9-61c99ba03314-ceph\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131759 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: 
I1007 13:51:26.131835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131949 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-dev\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132031 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132146 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-run\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132223 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132315 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-sys\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132377 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-sys\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131964 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.131989 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-dev\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132231 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-run\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132554 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-lib-modules\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc 
kubenswrapper[4959]: I1007 13:51:26.132649 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132731 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-sys\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132838 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.132931 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef8431b3-9196-4986-aba7-43ffefa14817-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133029 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-dev\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133057 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133115 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133195 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133230 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133244 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133282 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.133446 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef8431b3-9196-4986-aba7-43ffefa14817-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.136100 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.136151 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef8431b3-9196-4986-aba7-43ffefa14817-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.136910 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.138205 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.138446 4959 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef8431b3-9196-4986-aba7-43ffefa14817-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.161488 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flm9h\" (UniqueName: \"kubernetes.io/projected/ef8431b3-9196-4986-aba7-43ffefa14817-kube-api-access-flm9h\") pod \"cinder-volume-volume1-0\" (UID: \"ef8431b3-9196-4986-aba7-43ffefa14817\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.229001 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235116 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-lib-modules\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235177 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-sys\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235210 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-dev\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235237 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235244 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-lib-modules\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235271 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-config-data\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235311 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-dev\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235350 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-sys\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235367 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " 
pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235405 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235414 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235440 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235468 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-scripts\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235507 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs72z\" (UniqueName: \"kubernetes.io/projected/3dcdce3c-0b57-4c61-84d9-61c99ba03314-kube-api-access-bs72z\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235565 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235585 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235713 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-run\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235772 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3dcdce3c-0b57-4c61-84d9-61c99ba03314-ceph\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235807 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.235875 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc 
kubenswrapper[4959]: I1007 13:51:26.235934 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.236075 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.236109 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-run\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.237254 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.237440 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.237649 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dcdce3c-0b57-4c61-84d9-61c99ba03314-etc-iscsi\") pod \"cinder-backup-0\" 
(UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.240407 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-scripts\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.240815 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.240846 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3dcdce3c-0b57-4c61-84d9-61c99ba03314-ceph\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.241503 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-config-data\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.241804 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcdce3c-0b57-4c61-84d9-61c99ba03314-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.252379 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs72z\" (UniqueName: 
\"kubernetes.io/projected/3dcdce3c-0b57-4c61-84d9-61c99ba03314-kube-api-access-bs72z\") pod \"cinder-backup-0\" (UID: \"3dcdce3c-0b57-4c61-84d9-61c99ba03314\") " pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.292828 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.482334 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-svqh8"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.484128 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-svqh8" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.499621 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-svqh8"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.549119 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xdr\" (UniqueName: \"kubernetes.io/projected/9d44e226-eb49-498b-a3c3-4fef79b4123e-kube-api-access-77xdr\") pod \"manila-db-create-svqh8\" (UID: \"9d44e226-eb49-498b-a3c3-4fef79b4123e\") " pod="openstack/manila-db-create-svqh8" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.650963 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77xdr\" (UniqueName: \"kubernetes.io/projected/9d44e226-eb49-498b-a3c3-4fef79b4123e-kube-api-access-77xdr\") pod \"manila-db-create-svqh8\" (UID: \"9d44e226-eb49-498b-a3c3-4fef79b4123e\") " pod="openstack/manila-db-create-svqh8" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.667145 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xdr\" (UniqueName: \"kubernetes.io/projected/9d44e226-eb49-498b-a3c3-4fef79b4123e-kube-api-access-77xdr\") pod \"manila-db-create-svqh8\" (UID: \"9d44e226-eb49-498b-a3c3-4fef79b4123e\") 
" pod="openstack/manila-db-create-svqh8" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.740969 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.742442 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.745978 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fvw77" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.746997 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.748441 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.748692 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.758846 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.807820 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.809169 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-svqh8" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.809414 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.812772 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.813177 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.823675 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.855780 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856088 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77ff234e-dd31-4847-8517-4befe98845f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856116 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctfp\" (UniqueName: \"kubernetes.io/projected/77ff234e-dd31-4847-8517-4befe98845f7-kube-api-access-dctfp\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856165 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856185 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856281 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856364 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856422 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ff234e-dd31-4847-8517-4befe98845f7-logs\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.856712 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/77ff234e-dd31-4847-8517-4befe98845f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.879260 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.894097 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.938242 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 13:51:26 crc kubenswrapper[4959]: W1007 13:51:26.947481 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dcdce3c_0b57_4c61_84d9_61c99ba03314.slice/crio-fbd56f5e425336f49f270c281cf20fa2814d1ccfd4b74c2d02842c1722e311ef WatchSource:0}: Error finding container fbd56f5e425336f49f270c281cf20fa2814d1ccfd4b74c2d02842c1722e311ef: Status 404 returned error can't find the container with id fbd56f5e425336f49f270c281cf20fa2814d1ccfd4b74c2d02842c1722e311ef Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.957980 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958049 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 
13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958070 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4br\" (UniqueName: \"kubernetes.io/projected/cc40f402-6581-45a7-945f-a64d217724ab-kube-api-access-9n4br\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958299 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cc40f402-6581-45a7-945f-a64d217724ab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958346 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958368 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc40f402-6581-45a7-945f-a64d217724ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc 
kubenswrapper[4959]: I1007 13:51:26.958399 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ff234e-dd31-4847-8517-4befe98845f7-logs\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958447 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77ff234e-dd31-4847-8517-4befe98845f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958507 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958586 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958610 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77ff234e-dd31-4847-8517-4befe98845f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958645 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dctfp\" (UniqueName: \"kubernetes.io/projected/77ff234e-dd31-4847-8517-4befe98845f7-kube-api-access-dctfp\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958732 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958753 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc40f402-6581-45a7-945f-a64d217724ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.958772 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.960041 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.960346 4959 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.960435 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77ff234e-dd31-4847-8517-4befe98845f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.962104 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ff234e-dd31-4847-8517-4befe98845f7-logs\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.963163 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.964303 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77ff234e-dd31-4847-8517-4befe98845f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.964569 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.966867 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.977100 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.986491 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctfp\" (UniqueName: \"kubernetes.io/projected/77ff234e-dd31-4847-8517-4befe98845f7-kube-api-access-dctfp\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:26 crc kubenswrapper[4959]: I1007 13:51:26.989420 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ff234e-dd31-4847-8517-4befe98845f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.002798 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-external-api-0\" (UID: \"77ff234e-dd31-4847-8517-4befe98845f7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.065664 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066413 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066442 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc40f402-6581-45a7-945f-a64d217724ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066463 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066160 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066503 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066534 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066578 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n4br\" (UniqueName: \"kubernetes.io/projected/cc40f402-6581-45a7-945f-a64d217724ab-kube-api-access-9n4br\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066620 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cc40f402-6581-45a7-945f-a64d217724ab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.066671 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc40f402-6581-45a7-945f-a64d217724ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc 
kubenswrapper[4959]: I1007 13:51:27.067059 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc40f402-6581-45a7-945f-a64d217724ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.067135 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc40f402-6581-45a7-945f-a64d217724ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.067164 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.070195 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.071419 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.073682 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.074073 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc40f402-6581-45a7-945f-a64d217724ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.074617 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cc40f402-6581-45a7-945f-a64d217724ab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.084256 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n4br\" (UniqueName: \"kubernetes.io/projected/cc40f402-6581-45a7-945f-a64d217724ab-kube-api-access-9n4br\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.102640 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc40f402-6581-45a7-945f-a64d217724ab\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.131906 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.280779 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-svqh8"] Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.632528 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.735858 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.772381 4959 generic.go:334] "Generic (PLEG): container finished" podID="9d44e226-eb49-498b-a3c3-4fef79b4123e" containerID="d0492eb4bbfff77a4f362e04d75722eef0005fc9e354f764a9f28800cd4e9d89" exitCode=0 Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.772459 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svqh8" event={"ID":"9d44e226-eb49-498b-a3c3-4fef79b4123e","Type":"ContainerDied","Data":"d0492eb4bbfff77a4f362e04d75722eef0005fc9e354f764a9f28800cd4e9d89"} Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.772489 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svqh8" event={"ID":"9d44e226-eb49-498b-a3c3-4fef79b4123e","Type":"ContainerStarted","Data":"27e50214e22973dfa1aa8bddd52a24f2dfe8b9a0ce414ceb7dc383cab891f3a1"} Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.773779 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77ff234e-dd31-4847-8517-4befe98845f7","Type":"ContainerStarted","Data":"e00f1c7536935a3bee0cb709327d16af87225f845044432081819ae17044fc2c"} Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.775751 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"3dcdce3c-0b57-4c61-84d9-61c99ba03314","Type":"ContainerStarted","Data":"fbd56f5e425336f49f270c281cf20fa2814d1ccfd4b74c2d02842c1722e311ef"} Oct 07 13:51:27 crc kubenswrapper[4959]: I1007 13:51:27.777457 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"ef8431b3-9196-4986-aba7-43ffefa14817","Type":"ContainerStarted","Data":"772f23fb5abbdc8f965f0e5ce6234a8654f8f4ced8c8fd8faf0b8e47db07bd53"} Oct 07 13:51:27 crc kubenswrapper[4959]: W1007 13:51:27.850803 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc40f402_6581_45a7_945f_a64d217724ab.slice/crio-e4a86387489c7ef37523ea8ddfae3ccad8090bec5de32f1af851067f50b9be99 WatchSource:0}: Error finding container e4a86387489c7ef37523ea8ddfae3ccad8090bec5de32f1af851067f50b9be99: Status 404 returned error can't find the container with id e4a86387489c7ef37523ea8ddfae3ccad8090bec5de32f1af851067f50b9be99 Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.790806 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77ff234e-dd31-4847-8517-4befe98845f7","Type":"ContainerStarted","Data":"4d5ace4bb2b59701e623c503c5c72f509d32cafacee650d2c9b7789819e5026a"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.793055 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc40f402-6581-45a7-945f-a64d217724ab","Type":"ContainerStarted","Data":"75ea3b839e85bc8c7f82d3e2644922d93713e35de8716ff32d81a254a0b3ae0a"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.793091 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc40f402-6581-45a7-945f-a64d217724ab","Type":"ContainerStarted","Data":"e4a86387489c7ef37523ea8ddfae3ccad8090bec5de32f1af851067f50b9be99"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 
13:51:28.794881 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3dcdce3c-0b57-4c61-84d9-61c99ba03314","Type":"ContainerStarted","Data":"d4a0e9364435a57099da30832fb912a2b3435c241de21ae50eb2b38273095538"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.794907 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3dcdce3c-0b57-4c61-84d9-61c99ba03314","Type":"ContainerStarted","Data":"c9d0ff58540257bfcd33dcff04259d11cea92e17564df8efd4321e0be06e6fcd"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.799791 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"ef8431b3-9196-4986-aba7-43ffefa14817","Type":"ContainerStarted","Data":"4444e6a06fb3b325110f0dfe80097efd9c5d0e90699817af6c6d5439cf3bd207"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.799838 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"ef8431b3-9196-4986-aba7-43ffefa14817","Type":"ContainerStarted","Data":"7d13cda063d40f1c6c5156de8c1693ab303209c9f35aebe53fe35238956205e7"} Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.835791 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.124928436 podStartE2EDuration="3.83577292s" podCreationTimestamp="2025-10-07 13:51:25 +0000 UTC" firstStartedPulling="2025-10-07 13:51:26.949471433 +0000 UTC m=+3039.110194110" lastFinishedPulling="2025-10-07 13:51:27.660315917 +0000 UTC m=+3039.821038594" observedRunningTime="2025-10-07 13:51:28.827013412 +0000 UTC m=+3040.987736109" watchObservedRunningTime="2025-10-07 13:51:28.83577292 +0000 UTC m=+3040.996495597" Oct 07 13:51:28 crc kubenswrapper[4959]: I1007 13:51:28.852734 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.799224878 
podStartE2EDuration="3.852715399s" podCreationTimestamp="2025-10-07 13:51:25 +0000 UTC" firstStartedPulling="2025-10-07 13:51:26.893679174 +0000 UTC m=+3039.054401851" lastFinishedPulling="2025-10-07 13:51:27.947169695 +0000 UTC m=+3040.107892372" observedRunningTime="2025-10-07 13:51:28.850501107 +0000 UTC m=+3041.011223804" watchObservedRunningTime="2025-10-07 13:51:28.852715399 +0000 UTC m=+3041.013438076" Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.075922 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-svqh8" Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.238289 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77xdr\" (UniqueName: \"kubernetes.io/projected/9d44e226-eb49-498b-a3c3-4fef79b4123e-kube-api-access-77xdr\") pod \"9d44e226-eb49-498b-a3c3-4fef79b4123e\" (UID: \"9d44e226-eb49-498b-a3c3-4fef79b4123e\") " Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.251247 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d44e226-eb49-498b-a3c3-4fef79b4123e-kube-api-access-77xdr" (OuterVolumeSpecName: "kube-api-access-77xdr") pod "9d44e226-eb49-498b-a3c3-4fef79b4123e" (UID: "9d44e226-eb49-498b-a3c3-4fef79b4123e"). InnerVolumeSpecName "kube-api-access-77xdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.340672 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77xdr\" (UniqueName: \"kubernetes.io/projected/9d44e226-eb49-498b-a3c3-4fef79b4123e-kube-api-access-77xdr\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.822922 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc40f402-6581-45a7-945f-a64d217724ab","Type":"ContainerStarted","Data":"ea2cb7f4ba487d946e478064d73367137bbd52e3e27cdcea925dd85430efc0ee"} Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.835568 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svqh8" event={"ID":"9d44e226-eb49-498b-a3c3-4fef79b4123e","Type":"ContainerDied","Data":"27e50214e22973dfa1aa8bddd52a24f2dfe8b9a0ce414ceb7dc383cab891f3a1"} Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.835859 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e50214e22973dfa1aa8bddd52a24f2dfe8b9a0ce414ceb7dc383cab891f3a1" Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.835923 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svqh8" Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.843272 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77ff234e-dd31-4847-8517-4befe98845f7","Type":"ContainerStarted","Data":"e80272c4b363c241a8ce444b7b25521b544b85096320d7ad582d135622f93436"} Oct 07 13:51:29 crc kubenswrapper[4959]: I1007 13:51:29.849052 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.849034543 podStartE2EDuration="4.849034543s" podCreationTimestamp="2025-10-07 13:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:51:29.847186411 +0000 UTC m=+3042.007909088" watchObservedRunningTime="2025-10-07 13:51:29.849034543 +0000 UTC m=+3042.009757220" Oct 07 13:51:30 crc kubenswrapper[4959]: I1007 13:51:30.112229 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.11221055 podStartE2EDuration="5.11221055s" podCreationTimestamp="2025-10-07 13:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:51:29.89203981 +0000 UTC m=+3042.052762487" watchObservedRunningTime="2025-10-07 13:51:30.11221055 +0000 UTC m=+3042.272933227" Oct 07 13:51:31 crc kubenswrapper[4959]: I1007 13:51:31.229735 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:31 crc kubenswrapper[4959]: I1007 13:51:31.293482 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.471208 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/cinder-volume-volume1-0" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.557741 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.558794 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-3bfb-account-create-kcln7"] Oct 07 13:51:36 crc kubenswrapper[4959]: E1007 13:51:36.559188 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d44e226-eb49-498b-a3c3-4fef79b4123e" containerName="mariadb-database-create" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.559205 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d44e226-eb49-498b-a3c3-4fef79b4123e" containerName="mariadb-database-create" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.559392 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d44e226-eb49-498b-a3c3-4fef79b4123e" containerName="mariadb-database-create" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.560132 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.562382 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.572740 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3bfb-account-create-kcln7"] Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.688273 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v2cw\" (UniqueName: \"kubernetes.io/projected/7fb1aef0-7687-47f6-a79b-515a5d4d6791-kube-api-access-7v2cw\") pod \"manila-3bfb-account-create-kcln7\" (UID: \"7fb1aef0-7687-47f6-a79b-515a5d4d6791\") " pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.790844 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v2cw\" (UniqueName: \"kubernetes.io/projected/7fb1aef0-7687-47f6-a79b-515a5d4d6791-kube-api-access-7v2cw\") pod \"manila-3bfb-account-create-kcln7\" (UID: \"7fb1aef0-7687-47f6-a79b-515a5d4d6791\") " pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.847994 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v2cw\" (UniqueName: \"kubernetes.io/projected/7fb1aef0-7687-47f6-a79b-515a5d4d6791-kube-api-access-7v2cw\") pod \"manila-3bfb-account-create-kcln7\" (UID: \"7fb1aef0-7687-47f6-a79b-515a5d4d6791\") " pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:36 crc kubenswrapper[4959]: I1007 13:51:36.878312 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.066728 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.067106 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.098307 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.113577 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.132523 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.132557 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.178474 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.193205 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.349569 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3bfb-account-create-kcln7"] Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.695480 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.695873 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.695929 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.696822 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.696890 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" gracePeriod=600 Oct 07 13:51:37 crc kubenswrapper[4959]: E1007 13:51:37.839170 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 
13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.903167 4959 generic.go:334] "Generic (PLEG): container finished" podID="7fb1aef0-7687-47f6-a79b-515a5d4d6791" containerID="bd4fae6001249ef6a531d909cef8b8ba89b596272f0158d0632c2ba1694c62da" exitCode=0 Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.903230 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3bfb-account-create-kcln7" event={"ID":"7fb1aef0-7687-47f6-a79b-515a5d4d6791","Type":"ContainerDied","Data":"bd4fae6001249ef6a531d909cef8b8ba89b596272f0158d0632c2ba1694c62da"} Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.903260 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3bfb-account-create-kcln7" event={"ID":"7fb1aef0-7687-47f6-a79b-515a5d4d6791","Type":"ContainerStarted","Data":"9b959439d2e77ab08c3c42a79dac919f85418ed63a4d58527b55209d4784e395"} Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905098 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" exitCode=0 Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905191 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"} Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905236 4959 scope.go:117] "RemoveContainer" containerID="522065e10442a4399769eb986e144cd13cc82269fb86133b36dff3c473205e1d" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905621 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905661 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905673 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.905681 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:37 crc kubenswrapper[4959]: I1007 13:51:37.906054 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:51:37 crc kubenswrapper[4959]: E1007 13:51:37.906293 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.251108 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.338319 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v2cw\" (UniqueName: \"kubernetes.io/projected/7fb1aef0-7687-47f6-a79b-515a5d4d6791-kube-api-access-7v2cw\") pod \"7fb1aef0-7687-47f6-a79b-515a5d4d6791\" (UID: \"7fb1aef0-7687-47f6-a79b-515a5d4d6791\") " Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.344067 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb1aef0-7687-47f6-a79b-515a5d4d6791-kube-api-access-7v2cw" (OuterVolumeSpecName: "kube-api-access-7v2cw") pod "7fb1aef0-7687-47f6-a79b-515a5d4d6791" (UID: "7fb1aef0-7687-47f6-a79b-515a5d4d6791"). 
InnerVolumeSpecName "kube-api-access-7v2cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.440972 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v2cw\" (UniqueName: \"kubernetes.io/projected/7fb1aef0-7687-47f6-a79b-515a5d4d6791-kube-api-access-7v2cw\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.923235 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3bfb-account-create-kcln7" event={"ID":"7fb1aef0-7687-47f6-a79b-515a5d4d6791","Type":"ContainerDied","Data":"9b959439d2e77ab08c3c42a79dac919f85418ed63a4d58527b55209d4784e395"} Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.923280 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b959439d2e77ab08c3c42a79dac919f85418ed63a4d58527b55209d4784e395" Oct 07 13:51:39 crc kubenswrapper[4959]: I1007 13:51:39.923291 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3bfb-account-create-kcln7" Oct 07 13:51:40 crc kubenswrapper[4959]: I1007 13:51:40.003750 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 13:51:40 crc kubenswrapper[4959]: I1007 13:51:40.004052 4959 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:51:40 crc kubenswrapper[4959]: I1007 13:51:40.008016 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 13:51:40 crc kubenswrapper[4959]: I1007 13:51:40.021149 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:40 crc kubenswrapper[4959]: I1007 13:51:40.021262 4959 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:51:40 crc kubenswrapper[4959]: I1007 13:51:40.025890 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.915551 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-wwhr6"] Oct 07 13:51:41 crc kubenswrapper[4959]: E1007 13:51:41.916415 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb1aef0-7687-47f6-a79b-515a5d4d6791" containerName="mariadb-account-create" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.916428 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb1aef0-7687-47f6-a79b-515a5d4d6791" containerName="mariadb-account-create" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.916607 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb1aef0-7687-47f6-a79b-515a5d4d6791" containerName="mariadb-account-create" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.917333 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.922284 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-h5cv5" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.923964 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 07 13:51:41 crc kubenswrapper[4959]: I1007 13:51:41.931521 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-wwhr6"] Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.087763 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-combined-ca-bundle\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.087848 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-job-config-data\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.087881 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfv8w\" (UniqueName: \"kubernetes.io/projected/f40be374-6671-451c-a271-163847256266-kube-api-access-rfv8w\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.087928 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-config-data\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.189783 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-combined-ca-bundle\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.189875 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-job-config-data\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.189906 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfv8w\" (UniqueName: \"kubernetes.io/projected/f40be374-6671-451c-a271-163847256266-kube-api-access-rfv8w\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.189928 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-config-data\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.195714 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-job-config-data\") pod \"manila-db-sync-wwhr6\" (UID: 
\"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.195879 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-config-data\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.196811 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-combined-ca-bundle\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.207455 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfv8w\" (UniqueName: \"kubernetes.io/projected/f40be374-6671-451c-a271-163847256266-kube-api-access-rfv8w\") pod \"manila-db-sync-wwhr6\" (UID: \"f40be374-6671-451c-a271-163847256266\") " pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.257076 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wwhr6" Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.796435 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-wwhr6"] Oct 07 13:51:42 crc kubenswrapper[4959]: W1007 13:51:42.797751 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf40be374_6671_451c_a271_163847256266.slice/crio-2b00c88a9ce7e1462738b65ca6d5c2aeb68c2fbf646a29be4527f2c512d62a95 WatchSource:0}: Error finding container 2b00c88a9ce7e1462738b65ca6d5c2aeb68c2fbf646a29be4527f2c512d62a95: Status 404 returned error can't find the container with id 2b00c88a9ce7e1462738b65ca6d5c2aeb68c2fbf646a29be4527f2c512d62a95 Oct 07 13:51:42 crc kubenswrapper[4959]: I1007 13:51:42.955384 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wwhr6" event={"ID":"f40be374-6671-451c-a271-163847256266","Type":"ContainerStarted","Data":"2b00c88a9ce7e1462738b65ca6d5c2aeb68c2fbf646a29be4527f2c512d62a95"} Oct 07 13:51:49 crc kubenswrapper[4959]: I1007 13:51:49.012297 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wwhr6" event={"ID":"f40be374-6671-451c-a271-163847256266","Type":"ContainerStarted","Data":"f15e79bc5c4b138c559fd8beeab050a72a723fec38087cc426049a56d8d9e04f"} Oct 07 13:51:49 crc kubenswrapper[4959]: I1007 13:51:49.034779 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-wwhr6" podStartSLOduration=2.959036539 podStartE2EDuration="8.034761719s" podCreationTimestamp="2025-10-07 13:51:41 +0000 UTC" firstStartedPulling="2025-10-07 13:51:42.800175337 +0000 UTC m=+3054.960898014" lastFinishedPulling="2025-10-07 13:51:47.875900517 +0000 UTC m=+3060.036623194" observedRunningTime="2025-10-07 13:51:49.03162021 +0000 UTC m=+3061.192342897" watchObservedRunningTime="2025-10-07 13:51:49.034761719 +0000 UTC m=+3061.195484396" Oct 07 
13:51:50 crc kubenswrapper[4959]: I1007 13:51:50.809674 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:51:50 crc kubenswrapper[4959]: E1007 13:51:50.810273 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.298470 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8v4ps"] Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.301954 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.308818 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8v4ps"] Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.480996 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67mc\" (UniqueName: \"kubernetes.io/projected/a5f262c7-0697-441b-ab95-f543fdfbde54-kube-api-access-f67mc\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.481129 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-catalog-content\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " 
pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.481228 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-utilities\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.583043 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-utilities\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.583119 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67mc\" (UniqueName: \"kubernetes.io/projected/a5f262c7-0697-441b-ab95-f543fdfbde54-kube-api-access-f67mc\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.583222 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-catalog-content\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.583671 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-utilities\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " 
pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.583742 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-catalog-content\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.611850 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67mc\" (UniqueName: \"kubernetes.io/projected/a5f262c7-0697-441b-ab95-f543fdfbde54-kube-api-access-f67mc\") pod \"redhat-marketplace-8v4ps\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:56 crc kubenswrapper[4959]: I1007 13:51:56.697297 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:51:57 crc kubenswrapper[4959]: I1007 13:51:57.138057 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8v4ps"] Oct 07 13:51:57 crc kubenswrapper[4959]: W1007 13:51:57.150101 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5f262c7_0697_441b_ab95_f543fdfbde54.slice/crio-1a76b3e5faa62009ee6cf5f10193f516b69cd74b338a44c7399dafe40df69ba9 WatchSource:0}: Error finding container 1a76b3e5faa62009ee6cf5f10193f516b69cd74b338a44c7399dafe40df69ba9: Status 404 returned error can't find the container with id 1a76b3e5faa62009ee6cf5f10193f516b69cd74b338a44c7399dafe40df69ba9 Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.087867 4959 generic.go:334] "Generic (PLEG): container finished" podID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerID="6706440ea2a5628d6a781ec27f1780f480b8c7112b07b8410d894a91b3885823" exitCode=0 
Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.087947 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8v4ps" event={"ID":"a5f262c7-0697-441b-ab95-f543fdfbde54","Type":"ContainerDied","Data":"6706440ea2a5628d6a781ec27f1780f480b8c7112b07b8410d894a91b3885823"} Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.088222 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8v4ps" event={"ID":"a5f262c7-0697-441b-ab95-f543fdfbde54","Type":"ContainerStarted","Data":"1a76b3e5faa62009ee6cf5f10193f516b69cd74b338a44c7399dafe40df69ba9"} Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.696872 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fw59x"] Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.701971 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.707142 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fw59x"] Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.821640 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s87\" (UniqueName: \"kubernetes.io/projected/2a0efd29-49c2-420a-ac68-b479aa8a76d5-kube-api-access-h8s87\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.821717 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-utilities\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " 
pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.821760 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-catalog-content\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.924033 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s87\" (UniqueName: \"kubernetes.io/projected/2a0efd29-49c2-420a-ac68-b479aa8a76d5-kube-api-access-h8s87\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.924111 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-utilities\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.924141 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-catalog-content\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.924836 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-utilities\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " 
pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.924970 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-catalog-content\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:58 crc kubenswrapper[4959]: I1007 13:51:58.943670 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s87\" (UniqueName: \"kubernetes.io/projected/2a0efd29-49c2-420a-ac68-b479aa8a76d5-kube-api-access-h8s87\") pod \"community-operators-fw59x\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:59 crc kubenswrapper[4959]: I1007 13:51:59.033645 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:51:59 crc kubenswrapper[4959]: I1007 13:51:59.412813 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fw59x"] Oct 07 13:52:00 crc kubenswrapper[4959]: I1007 13:52:00.109842 4959 generic.go:334] "Generic (PLEG): container finished" podID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerID="3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132" exitCode=0 Oct 07 13:52:00 crc kubenswrapper[4959]: I1007 13:52:00.109951 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerDied","Data":"3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132"} Oct 07 13:52:00 crc kubenswrapper[4959]: I1007 13:52:00.110290 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" 
event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerStarted","Data":"1a87d7f252ed026f3c8b7d29873a6526731d7d9dc9cfc1ee36e725afb12f9097"} Oct 07 13:52:00 crc kubenswrapper[4959]: I1007 13:52:00.113505 4959 generic.go:334] "Generic (PLEG): container finished" podID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerID="f594996b50b34e48db9f89ac3c6225498137f5c5f5293343fda155a2920912f5" exitCode=0 Oct 07 13:52:00 crc kubenswrapper[4959]: I1007 13:52:00.113548 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8v4ps" event={"ID":"a5f262c7-0697-441b-ab95-f543fdfbde54","Type":"ContainerDied","Data":"f594996b50b34e48db9f89ac3c6225498137f5c5f5293343fda155a2920912f5"} Oct 07 13:52:01 crc kubenswrapper[4959]: I1007 13:52:01.122173 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8v4ps" event={"ID":"a5f262c7-0697-441b-ab95-f543fdfbde54","Type":"ContainerStarted","Data":"cf16cb76e9da9567d0c4a6e2108094f657b3cd648dbc1930b406a512206f8b99"} Oct 07 13:52:01 crc kubenswrapper[4959]: I1007 13:52:01.123656 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerStarted","Data":"d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65"} Oct 07 13:52:01 crc kubenswrapper[4959]: I1007 13:52:01.125437 4959 generic.go:334] "Generic (PLEG): container finished" podID="f40be374-6671-451c-a271-163847256266" containerID="f15e79bc5c4b138c559fd8beeab050a72a723fec38087cc426049a56d8d9e04f" exitCode=0 Oct 07 13:52:01 crc kubenswrapper[4959]: I1007 13:52:01.125484 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wwhr6" event={"ID":"f40be374-6671-451c-a271-163847256266","Type":"ContainerDied","Data":"f15e79bc5c4b138c559fd8beeab050a72a723fec38087cc426049a56d8d9e04f"} Oct 07 13:52:01 crc kubenswrapper[4959]: I1007 13:52:01.146404 
4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8v4ps" podStartSLOduration=2.702861243 podStartE2EDuration="5.146381038s" podCreationTimestamp="2025-10-07 13:51:56 +0000 UTC" firstStartedPulling="2025-10-07 13:51:58.090179965 +0000 UTC m=+3070.250902642" lastFinishedPulling="2025-10-07 13:52:00.53369975 +0000 UTC m=+3072.694422437" observedRunningTime="2025-10-07 13:52:01.140316946 +0000 UTC m=+3073.301039633" watchObservedRunningTime="2025-10-07 13:52:01.146381038 +0000 UTC m=+3073.307103715" Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.138152 4959 generic.go:334] "Generic (PLEG): container finished" podID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerID="d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65" exitCode=0 Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.138247 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerDied","Data":"d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65"} Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.735101 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wwhr6" Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.898702 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-combined-ca-bundle\") pod \"f40be374-6671-451c-a271-163847256266\" (UID: \"f40be374-6671-451c-a271-163847256266\") " Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.899146 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-config-data\") pod \"f40be374-6671-451c-a271-163847256266\" (UID: \"f40be374-6671-451c-a271-163847256266\") " Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.899184 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-job-config-data\") pod \"f40be374-6671-451c-a271-163847256266\" (UID: \"f40be374-6671-451c-a271-163847256266\") " Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.899369 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfv8w\" (UniqueName: \"kubernetes.io/projected/f40be374-6671-451c-a271-163847256266-kube-api-access-rfv8w\") pod \"f40be374-6671-451c-a271-163847256266\" (UID: \"f40be374-6671-451c-a271-163847256266\") " Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.906115 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "f40be374-6671-451c-a271-163847256266" (UID: "f40be374-6671-451c-a271-163847256266"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.906203 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f40be374-6671-451c-a271-163847256266-kube-api-access-rfv8w" (OuterVolumeSpecName: "kube-api-access-rfv8w") pod "f40be374-6671-451c-a271-163847256266" (UID: "f40be374-6671-451c-a271-163847256266"). InnerVolumeSpecName "kube-api-access-rfv8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.910924 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-config-data" (OuterVolumeSpecName: "config-data") pod "f40be374-6671-451c-a271-163847256266" (UID: "f40be374-6671-451c-a271-163847256266"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:02 crc kubenswrapper[4959]: I1007 13:52:02.930370 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f40be374-6671-451c-a271-163847256266" (UID: "f40be374-6671-451c-a271-163847256266"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.005136 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.005175 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.005185 4959 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f40be374-6671-451c-a271-163847256266-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.005195 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfv8w\" (UniqueName: \"kubernetes.io/projected/f40be374-6671-451c-a271-163847256266-kube-api-access-rfv8w\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.148953 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wwhr6" event={"ID":"f40be374-6671-451c-a271-163847256266","Type":"ContainerDied","Data":"2b00c88a9ce7e1462738b65ca6d5c2aeb68c2fbf646a29be4527f2c512d62a95"} Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.148984 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wwhr6" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.149017 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b00c88a9ce7e1462738b65ca6d5c2aeb68c2fbf646a29be4527f2c512d62a95" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.151800 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerStarted","Data":"816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e"} Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.174418 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fw59x" podStartSLOduration=2.357269417 podStartE2EDuration="5.174401715s" podCreationTimestamp="2025-10-07 13:51:58 +0000 UTC" firstStartedPulling="2025-10-07 13:52:00.113013126 +0000 UTC m=+3072.273735843" lastFinishedPulling="2025-10-07 13:52:02.930145464 +0000 UTC m=+3075.090868141" observedRunningTime="2025-10-07 13:52:03.166593025 +0000 UTC m=+3075.327315722" watchObservedRunningTime="2025-10-07 13:52:03.174401715 +0000 UTC m=+3075.335124392" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.544442 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 13:52:03 crc kubenswrapper[4959]: E1007 13:52:03.545258 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40be374-6671-451c-a271-163847256266" containerName="manila-db-sync" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.545278 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40be374-6671-451c-a271-163847256266" containerName="manila-db-sync" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.545530 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f40be374-6671-451c-a271-163847256266" containerName="manila-db-sync" Oct 07 
13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.551327 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.554439 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.557104 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.557823 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.558069 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.558156 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-h5cv5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.558228 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.565102 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.568457 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.586701 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.621924 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.621971 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4m6\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-kube-api-access-zp4m6\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.621994 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622008 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622050 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-scripts\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622064 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622081 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-ceph\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622120 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprxx\" (UniqueName: \"kubernetes.io/projected/86164cd8-7757-41e8-bc84-a7503528ee47-kube-api-access-qprxx\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622137 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-scripts\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622153 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622173 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86164cd8-7757-41e8-bc84-a7503528ee47-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " 
pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622204 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622221 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.622261 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.643147 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d8975557-jwvh5"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.644651 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.664492 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d8975557-jwvh5"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.723548 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.723638 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.723704 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.723731 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.723755 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4m6\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-kube-api-access-zp4m6\") pod 
\"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.723776 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724072 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724758 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724835 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-scripts\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724864 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc 
kubenswrapper[4959]: I1007 13:52:03.724883 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-ceph\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724941 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprxx\" (UniqueName: \"kubernetes.io/projected/86164cd8-7757-41e8-bc84-a7503528ee47-kube-api-access-qprxx\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724964 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-scripts\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.724982 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.725014 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86164cd8-7757-41e8-bc84-a7503528ee47-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.725143 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/86164cd8-7757-41e8-bc84-a7503528ee47-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.725231 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.730548 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.732212 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-scripts\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.733809 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.733913 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 
13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.734111 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-scripts\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.734588 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.735175 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.739965 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.744694 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-ceph\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.751769 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4m6\" (UniqueName: 
\"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-kube-api-access-zp4m6\") pod \"manila-share-share1-0\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.755356 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprxx\" (UniqueName: \"kubernetes.io/projected/86164cd8-7757-41e8-bc84-a7503528ee47-kube-api-access-qprxx\") pod \"manila-scheduler-0\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") " pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.796244 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.798019 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.800857 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.810510 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.828964 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-config\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.831060 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-ovsdbserver-sb\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " 
pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.831233 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-scripts\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.831341 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8nw\" (UniqueName: \"kubernetes.io/projected/e8fbe198-197d-4725-acfc-c846f5b5c32a-kube-api-access-sm8nw\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.831454 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.831561 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data-custom\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.831688 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a287d3d9-022c-464c-a618-e9a039a895ae-logs\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 
13:52:03.832595 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-ovsdbserver-nb\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.832797 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.832920 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-dns-svc\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.833121 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.833230 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4crng\" (UniqueName: \"kubernetes.io/projected/a287d3d9-022c-464c-a618-e9a039a895ae-kube-api-access-4crng\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.833381 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a287d3d9-022c-464c-a618-e9a039a895ae-etc-machine-id\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.892931 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.904279 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.937651 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-scripts\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.937740 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8nw\" (UniqueName: \"kubernetes.io/projected/e8fbe198-197d-4725-acfc-c846f5b5c32a-kube-api-access-sm8nw\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.937773 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.937800 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data-custom\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.937839 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a287d3d9-022c-464c-a618-e9a039a895ae-logs\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.937932 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-ovsdbserver-nb\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938069 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938099 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-dns-svc\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938123 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " 
pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938143 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4crng\" (UniqueName: \"kubernetes.io/projected/a287d3d9-022c-464c-a618-e9a039a895ae-kube-api-access-4crng\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938197 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a287d3d9-022c-464c-a618-e9a039a895ae-etc-machine-id\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938271 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-config\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.938323 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-ovsdbserver-sb\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.939545 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-ovsdbserver-sb\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.941517 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a287d3d9-022c-464c-a618-e9a039a895ae-logs\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.941592 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a287d3d9-022c-464c-a618-e9a039a895ae-etc-machine-id\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.942307 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-config\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.942901 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.945029 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-dns-svc\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.945258 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fbe198-197d-4725-acfc-c846f5b5c32a-ovsdbserver-nb\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.947511 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.955319 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.960418 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data-custom\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.960428 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-scripts\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.963401 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4crng\" (UniqueName: \"kubernetes.io/projected/a287d3d9-022c-464c-a618-e9a039a895ae-kube-api-access-4crng\") pod \"manila-api-0\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") " pod="openstack/manila-api-0"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.976368 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8nw\" (UniqueName: \"kubernetes.io/projected/e8fbe198-197d-4725-acfc-c846f5b5c32a-kube-api-access-sm8nw\") pod \"dnsmasq-dns-55d8975557-jwvh5\" (UID: \"e8fbe198-197d-4725-acfc-c846f5b5c32a\") " pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:03 crc kubenswrapper[4959]: I1007 13:52:03.979233 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:04 crc kubenswrapper[4959]: I1007 13:52:04.149988 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 07 13:52:04 crc kubenswrapper[4959]: I1007 13:52:04.490889 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 07 13:52:04 crc kubenswrapper[4959]: W1007 13:52:04.659238 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346eefdf_f870_4709_8f2e_6c26492cb22a.slice/crio-08ef13277420b7168b6e8cedfbd4ecce267c692f35f163419d5028a738cfc263 WatchSource:0}: Error finding container 08ef13277420b7168b6e8cedfbd4ecce267c692f35f163419d5028a738cfc263: Status 404 returned error can't find the container with id 08ef13277420b7168b6e8cedfbd4ecce267c692f35f163419d5028a738cfc263
Oct 07 13:52:04 crc kubenswrapper[4959]: I1007 13:52:04.659576 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Oct 07 13:52:04 crc kubenswrapper[4959]: W1007 13:52:04.685589 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8fbe198_197d_4725_acfc_c846f5b5c32a.slice/crio-15801da82d30720b3241ae1047c1ac5b5ebb3e4b0df50352fdf366b66d1ac9a0 WatchSource:0}: Error finding container 15801da82d30720b3241ae1047c1ac5b5ebb3e4b0df50352fdf366b66d1ac9a0: Status 404 returned error can't find the container with id 15801da82d30720b3241ae1047c1ac5b5ebb3e4b0df50352fdf366b66d1ac9a0
Oct 07 13:52:04 crc kubenswrapper[4959]: I1007 13:52:04.687688 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d8975557-jwvh5"]
Oct 07 13:52:04 crc kubenswrapper[4959]: I1007 13:52:04.809362 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:52:04 crc kubenswrapper[4959]: E1007 13:52:04.809847 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:52:04 crc kubenswrapper[4959]: W1007 13:52:04.876541 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda287d3d9_022c_464c_a618_e9a039a895ae.slice/crio-f9b90feb2d255d1dea65e08e3cc942586349449cb4a2d1789880fc0d10e2c48d WatchSource:0}: Error finding container f9b90feb2d255d1dea65e08e3cc942586349449cb4a2d1789880fc0d10e2c48d: Status 404 returned error can't find the container with id f9b90feb2d255d1dea65e08e3cc942586349449cb4a2d1789880fc0d10e2c48d
Oct 07 13:52:04 crc kubenswrapper[4959]: I1007 13:52:04.880818 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 07 13:52:05 crc kubenswrapper[4959]: I1007 13:52:05.179836 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"346eefdf-f870-4709-8f2e-6c26492cb22a","Type":"ContainerStarted","Data":"08ef13277420b7168b6e8cedfbd4ecce267c692f35f163419d5028a738cfc263"}
Oct 07 13:52:05 crc kubenswrapper[4959]: I1007 13:52:05.181260 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a287d3d9-022c-464c-a618-e9a039a895ae","Type":"ContainerStarted","Data":"f9b90feb2d255d1dea65e08e3cc942586349449cb4a2d1789880fc0d10e2c48d"}
Oct 07 13:52:05 crc kubenswrapper[4959]: I1007 13:52:05.188401 4959 generic.go:334] "Generic (PLEG): container finished" podID="e8fbe198-197d-4725-acfc-c846f5b5c32a" containerID="74b530ef2007021364d1dc61e3299d0d45130510ae131149b64d7194b6eb86e2" exitCode=0
Oct 07 13:52:05 crc kubenswrapper[4959]: I1007 13:52:05.188502 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d8975557-jwvh5" event={"ID":"e8fbe198-197d-4725-acfc-c846f5b5c32a","Type":"ContainerDied","Data":"74b530ef2007021364d1dc61e3299d0d45130510ae131149b64d7194b6eb86e2"}
Oct 07 13:52:05 crc kubenswrapper[4959]: I1007 13:52:05.188536 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d8975557-jwvh5" event={"ID":"e8fbe198-197d-4725-acfc-c846f5b5c32a","Type":"ContainerStarted","Data":"15801da82d30720b3241ae1047c1ac5b5ebb3e4b0df50352fdf366b66d1ac9a0"}
Oct 07 13:52:05 crc kubenswrapper[4959]: I1007 13:52:05.191182 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"86164cd8-7757-41e8-bc84-a7503528ee47","Type":"ContainerStarted","Data":"1f941d31dc02618ff4e9a1d97b19e85724f1dfc4344fdbfbba7aee25b70f59f1"}
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.239052 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d8975557-jwvh5" event={"ID":"e8fbe198-197d-4725-acfc-c846f5b5c32a","Type":"ContainerStarted","Data":"98398f283864e1ed97c6aae2d3198e50766caea76ac0148d001a429fae66b3b5"}
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.240113 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d8975557-jwvh5"
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.250646 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"86164cd8-7757-41e8-bc84-a7503528ee47","Type":"ContainerStarted","Data":"20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f"}
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.254816 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a287d3d9-022c-464c-a618-e9a039a895ae","Type":"ContainerStarted","Data":"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3"}
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.254847 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a287d3d9-022c-464c-a618-e9a039a895ae","Type":"ContainerStarted","Data":"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d"}
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.255739 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.300514 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d8975557-jwvh5" podStartSLOduration=3.300497697 podStartE2EDuration="3.300497697s" podCreationTimestamp="2025-10-07 13:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:52:06.272053382 +0000 UTC m=+3078.432776079" watchObservedRunningTime="2025-10-07 13:52:06.300497697 +0000 UTC m=+3078.461220374"
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.300838 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.3008336160000002 podStartE2EDuration="3.300833616s" podCreationTimestamp="2025-10-07 13:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:52:06.295389832 +0000 UTC m=+3078.456112519" watchObservedRunningTime="2025-10-07 13:52:06.300833616 +0000 UTC m=+3078.461556293"
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.686345 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.697490 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8v4ps"
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.697551 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8v4ps"
Oct 07 13:52:06 crc kubenswrapper[4959]: I1007 13:52:06.755387 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8v4ps"
Oct 07 13:52:07 crc kubenswrapper[4959]: I1007 13:52:07.273255 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"86164cd8-7757-41e8-bc84-a7503528ee47","Type":"ContainerStarted","Data":"5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b"}
Oct 07 13:52:07 crc kubenswrapper[4959]: I1007 13:52:07.293610 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.743108541 podStartE2EDuration="4.293581598s" podCreationTimestamp="2025-10-07 13:52:03 +0000 UTC" firstStartedPulling="2025-10-07 13:52:04.489452478 +0000 UTC m=+3076.650175155" lastFinishedPulling="2025-10-07 13:52:05.039925535 +0000 UTC m=+3077.200648212" observedRunningTime="2025-10-07 13:52:07.288571396 +0000 UTC m=+3079.449294083" watchObservedRunningTime="2025-10-07 13:52:07.293581598 +0000 UTC m=+3079.454304265"
Oct 07 13:52:07 crc kubenswrapper[4959]: I1007 13:52:07.335876 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8v4ps"
Oct 07 13:52:07 crc kubenswrapper[4959]: I1007 13:52:07.385197 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8v4ps"]
Oct 07 13:52:08 crc kubenswrapper[4959]: I1007 13:52:08.279017 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api-log" containerID="cri-o://94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d" gracePeriod=30
Oct 07 13:52:08 crc kubenswrapper[4959]: I1007 13:52:08.279052 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api" containerID="cri-o://0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3" gracePeriod=30
Oct 07 13:52:08 crc kubenswrapper[4959]: I1007 13:52:08.963878 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 07 13:52:08 crc kubenswrapper[4959]: I1007 13:52:08.979537 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-combined-ca-bundle\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:08 crc kubenswrapper[4959]: I1007 13:52:08.979679 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a287d3d9-022c-464c-a618-e9a039a895ae-etc-machine-id\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:08 crc kubenswrapper[4959]: I1007 13:52:08.980234 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a287d3d9-022c-464c-a618-e9a039a895ae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.024384 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.034133 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fw59x"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.034208 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fw59x"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.080802 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-scripts\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081106 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a287d3d9-022c-464c-a618-e9a039a895ae-logs\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081147 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data-custom\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081172 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081198 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4crng\" (UniqueName: \"kubernetes.io/projected/a287d3d9-022c-464c-a618-e9a039a895ae-kube-api-access-4crng\") pod \"a287d3d9-022c-464c-a618-e9a039a895ae\" (UID: \"a287d3d9-022c-464c-a618-e9a039a895ae\") "
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081769 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081793 4959 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a287d3d9-022c-464c-a618-e9a039a895ae-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.081924 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a287d3d9-022c-464c-a618-e9a039a895ae-logs" (OuterVolumeSpecName: "logs") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.084656 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-scripts" (OuterVolumeSpecName: "scripts") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.085206 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a287d3d9-022c-464c-a618-e9a039a895ae-kube-api-access-4crng" (OuterVolumeSpecName: "kube-api-access-4crng") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "kube-api-access-4crng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.085419 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.099425 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fw59x"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.135840 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data" (OuterVolumeSpecName: "config-data") pod "a287d3d9-022c-464c-a618-e9a039a895ae" (UID: "a287d3d9-022c-464c-a618-e9a039a895ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.183204 4959 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a287d3d9-022c-464c-a618-e9a039a895ae-logs\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.183234 4959 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.183245 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.183254 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4crng\" (UniqueName: \"kubernetes.io/projected/a287d3d9-022c-464c-a618-e9a039a895ae-kube-api-access-4crng\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.183262 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a287d3d9-022c-464c-a618-e9a039a895ae-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.234958 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.235406 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="proxy-httpd" containerID="cri-o://c65d12018f1beac17f92afa2e2e6418f6f617507dfac1720c2f32dfa96d80c95" gracePeriod=30
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.235455 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-notification-agent" containerID="cri-o://04a381b3bf039da37374850b0bddb599b4be36decc14cca6e1825ff7a2aececb" gracePeriod=30
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.235423 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="sg-core" containerID="cri-o://289627a5d6d5996a028bb6befef0e7a2931e7337f456bf41ae79cde2cc8c3c13" gracePeriod=30
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.235368 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-central-agent" containerID="cri-o://051b7ddb5928aaeabb43744b419252997d6acc6334fd0f07a1cf59d37a66edc8" gracePeriod=30
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289444 4959 generic.go:334] "Generic (PLEG): container finished" podID="a287d3d9-022c-464c-a618-e9a039a895ae" containerID="0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3" exitCode=0
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289476 4959 generic.go:334] "Generic (PLEG): container finished" podID="a287d3d9-022c-464c-a618-e9a039a895ae" containerID="94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d" exitCode=143
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289482 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a287d3d9-022c-464c-a618-e9a039a895ae","Type":"ContainerDied","Data":"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3"}
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289506 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289518 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a287d3d9-022c-464c-a618-e9a039a895ae","Type":"ContainerDied","Data":"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d"}
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289529 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a287d3d9-022c-464c-a618-e9a039a895ae","Type":"ContainerDied","Data":"f9b90feb2d255d1dea65e08e3cc942586349449cb4a2d1789880fc0d10e2c48d"}
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.289550 4959 scope.go:117] "RemoveContainer" containerID="0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.290737 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8v4ps" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="registry-server" containerID="cri-o://cf16cb76e9da9567d0c4a6e2108094f657b3cd648dbc1930b406a512206f8b99" gracePeriod=2
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.331686 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.351534 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.362719 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fw59x"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.367709 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Oct 07 13:52:09 crc kubenswrapper[4959]: E1007 13:52:09.368259 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.368282 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api"
Oct 07 13:52:09 crc kubenswrapper[4959]: E1007 13:52:09.368318 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api-log"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.368326 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api-log"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.368644 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.368672 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" containerName="manila-api-log"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.369957 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.373684 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.373842 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.373949 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404370 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-config-data-custom\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404416 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-public-tls-certs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404468 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-config-data\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404510 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-scripts\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404541 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-internal-tls-certs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404564 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmcp\" (UniqueName: \"kubernetes.io/projected/640ef79d-5203-4f5e-8119-2f1eecb02bf1-kube-api-access-hhmcp\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404589 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404647 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640ef79d-5203-4f5e-8119-2f1eecb02bf1-logs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.404672 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/640ef79d-5203-4f5e-8119-2f1eecb02bf1-etc-machine-id\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:09 crc kubenswrapper[4959]: I1007 13:52:09.426221 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507074 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-config-data-custom\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507113 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-public-tls-certs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507169 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-config-data\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507204 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-scripts\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507247 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-internal-tls-certs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507263 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmcp\" (UniqueName: \"kubernetes.io/projected/640ef79d-5203-4f5e-8119-2f1eecb02bf1-kube-api-access-hhmcp\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.507284 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.508681 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640ef79d-5203-4f5e-8119-2f1eecb02bf1-logs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.508717 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/640ef79d-5203-4f5e-8119-2f1eecb02bf1-etc-machine-id\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.508849 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/640ef79d-5203-4f5e-8119-2f1eecb02bf1-etc-machine-id\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.508997 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640ef79d-5203-4f5e-8119-2f1eecb02bf1-logs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.511132 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-config-data-custom\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.512153 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.512175 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-scripts\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.512470 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-config-data\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.516189 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-public-tls-certs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0"
Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.524067 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName:
\"kubernetes.io/secret/640ef79d-5203-4f5e-8119-2f1eecb02bf1-internal-tls-certs\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0" Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.525384 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmcp\" (UniqueName: \"kubernetes.io/projected/640ef79d-5203-4f5e-8119-2f1eecb02bf1-kube-api-access-hhmcp\") pod \"manila-api-0\" (UID: \"640ef79d-5203-4f5e-8119-2f1eecb02bf1\") " pod="openstack/manila-api-0" Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.690473 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fw59x"] Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:09.707905 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.311261 4959 generic.go:334] "Generic (PLEG): container finished" podID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerID="c65d12018f1beac17f92afa2e2e6418f6f617507dfac1720c2f32dfa96d80c95" exitCode=0 Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.311670 4959 generic.go:334] "Generic (PLEG): container finished" podID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerID="289627a5d6d5996a028bb6befef0e7a2931e7337f456bf41ae79cde2cc8c3c13" exitCode=2 Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.311706 4959 generic.go:334] "Generic (PLEG): container finished" podID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerID="051b7ddb5928aaeabb43744b419252997d6acc6334fd0f07a1cf59d37a66edc8" exitCode=0 Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.311306 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerDied","Data":"c65d12018f1beac17f92afa2e2e6418f6f617507dfac1720c2f32dfa96d80c95"} Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 
13:52:10.311767 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerDied","Data":"289627a5d6d5996a028bb6befef0e7a2931e7337f456bf41ae79cde2cc8c3c13"} Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.311783 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerDied","Data":"051b7ddb5928aaeabb43744b419252997d6acc6334fd0f07a1cf59d37a66edc8"} Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.316607 4959 generic.go:334] "Generic (PLEG): container finished" podID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerID="cf16cb76e9da9567d0c4a6e2108094f657b3cd648dbc1930b406a512206f8b99" exitCode=0 Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.316655 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8v4ps" event={"ID":"a5f262c7-0697-441b-ab95-f543fdfbde54","Type":"ContainerDied","Data":"cf16cb76e9da9567d0c4a6e2108094f657b3cd648dbc1930b406a512206f8b99"} Oct 07 13:52:10 crc kubenswrapper[4959]: I1007 13:52:10.827803 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a287d3d9-022c-464c-a618-e9a039a895ae" path="/var/lib/kubelet/pods/a287d3d9-022c-464c-a618-e9a039a895ae/volumes" Oct 07 13:52:11 crc kubenswrapper[4959]: I1007 13:52:11.328727 4959 generic.go:334] "Generic (PLEG): container finished" podID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerID="04a381b3bf039da37374850b0bddb599b4be36decc14cca6e1825ff7a2aececb" exitCode=0 Oct 07 13:52:11 crc kubenswrapper[4959]: I1007 13:52:11.328805 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerDied","Data":"04a381b3bf039da37374850b0bddb599b4be36decc14cca6e1825ff7a2aececb"} Oct 07 13:52:11 crc kubenswrapper[4959]: I1007 13:52:11.329249 4959 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fw59x" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="registry-server" containerID="cri-o://816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e" gracePeriod=2 Oct 07 13:52:11 crc kubenswrapper[4959]: I1007 13:52:11.978182 4959 scope.go:117] "RemoveContainer" containerID="94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.153646 4959 scope.go:117] "RemoveContainer" containerID="0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3" Oct 07 13:52:12 crc kubenswrapper[4959]: E1007 13:52:12.155534 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3\": container with ID starting with 0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3 not found: ID does not exist" containerID="0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.155808 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3"} err="failed to get container status \"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3\": rpc error: code = NotFound desc = could not find container \"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3\": container with ID starting with 0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3 not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.155834 4959 scope.go:117] "RemoveContainer" containerID="94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d" Oct 07 13:52:12 crc kubenswrapper[4959]: E1007 13:52:12.169444 4959 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d\": container with ID starting with 94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d not found: ID does not exist" containerID="94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.169499 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d"} err="failed to get container status \"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d\": rpc error: code = NotFound desc = could not find container \"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d\": container with ID starting with 94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.169532 4959 scope.go:117] "RemoveContainer" containerID="0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.170650 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3"} err="failed to get container status \"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3\": rpc error: code = NotFound desc = could not find container \"0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3\": container with ID starting with 0a9d3d65717c0634b6062a70a46af8c2adac82384df6585d7910f8b17b4a86d3 not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.170693 4959 scope.go:117] "RemoveContainer" containerID="94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.170973 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d"} err="failed to get container status \"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d\": rpc error: code = NotFound desc = could not find container \"94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d\": container with ID starting with 94bb5cfd226c4227c061fa87711539be3fc2a8b2d600b532203cc5ca4f54d64d not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.319460 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.364320 4959 generic.go:334] "Generic (PLEG): container finished" podID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerID="816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e" exitCode=0 Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.364362 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerDied","Data":"816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e"} Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.364387 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw59x" event={"ID":"2a0efd29-49c2-420a-ac68-b479aa8a76d5","Type":"ContainerDied","Data":"1a87d7f252ed026f3c8b7d29873a6526731d7d9dc9cfc1ee36e725afb12f9097"} Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.364402 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fw59x" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.364408 4959 scope.go:117] "RemoveContainer" containerID="816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.371246 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-catalog-content\") pod \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.371371 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-utilities\") pod \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.371460 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s87\" (UniqueName: \"kubernetes.io/projected/2a0efd29-49c2-420a-ac68-b479aa8a76d5-kube-api-access-h8s87\") pod \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\" (UID: \"2a0efd29-49c2-420a-ac68-b479aa8a76d5\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.378197 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-utilities" (OuterVolumeSpecName: "utilities") pod "2a0efd29-49c2-420a-ac68-b479aa8a76d5" (UID: "2a0efd29-49c2-420a-ac68-b479aa8a76d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.378580 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0efd29-49c2-420a-ac68-b479aa8a76d5-kube-api-access-h8s87" (OuterVolumeSpecName: "kube-api-access-h8s87") pod "2a0efd29-49c2-420a-ac68-b479aa8a76d5" (UID: "2a0efd29-49c2-420a-ac68-b479aa8a76d5"). InnerVolumeSpecName "kube-api-access-h8s87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.426787 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.462503 4959 scope.go:117] "RemoveContainer" containerID="d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.474959 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-scripts\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.475271 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-combined-ca-bundle\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.475934 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.475954 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8s87\" (UniqueName: 
\"kubernetes.io/projected/2a0efd29-49c2-420a-ac68-b479aa8a76d5-kube-api-access-h8s87\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.481274 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-scripts" (OuterVolumeSpecName: "scripts") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.496509 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.500942 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a0efd29-49c2-420a-ac68-b479aa8a76d5" (UID: "2a0efd29-49c2-420a-ac68-b479aa8a76d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.569903 4959 scope.go:117] "RemoveContainer" containerID="3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.576524 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-log-httpd\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.576668 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-ceilometer-tls-certs\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.576776 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7wz5\" (UniqueName: \"kubernetes.io/projected/e5e1f006-c462-46be-b191-aab45dd3d1b7-kube-api-access-z7wz5\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.576798 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-config-data\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.576821 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-run-httpd\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " 
Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.576870 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-sg-core-conf-yaml\") pod \"e5e1f006-c462-46be-b191-aab45dd3d1b7\" (UID: \"e5e1f006-c462-46be-b191-aab45dd3d1b7\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.577294 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0efd29-49c2-420a-ac68-b479aa8a76d5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.577306 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.577682 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.580236 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.586323 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e1f006-c462-46be-b191-aab45dd3d1b7-kube-api-access-z7wz5" (OuterVolumeSpecName: "kube-api-access-z7wz5") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "kube-api-access-z7wz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.634895 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.635838 4959 scope.go:117] "RemoveContainer" containerID="816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e" Oct 07 13:52:12 crc kubenswrapper[4959]: E1007 13:52:12.636488 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e\": container with ID starting with 816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e not found: ID does not exist" containerID="816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.636529 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e"} err="failed to get container status \"816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e\": rpc error: code = NotFound desc = could not find 
container \"816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e\": container with ID starting with 816bc194abfbc9439b7847cc22b9f7849ffae368ada6c61d99829bb0eb33128e not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.636555 4959 scope.go:117] "RemoveContainer" containerID="d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65" Oct 07 13:52:12 crc kubenswrapper[4959]: E1007 13:52:12.637068 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65\": container with ID starting with d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65 not found: ID does not exist" containerID="d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.637101 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65"} err="failed to get container status \"d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65\": rpc error: code = NotFound desc = could not find container \"d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65\": container with ID starting with d1abd49d763328fac3ac24adeb9dda1e5f47579ea64b15f32c895b71a525bb65 not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.637120 4959 scope.go:117] "RemoveContainer" containerID="3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132" Oct 07 13:52:12 crc kubenswrapper[4959]: E1007 13:52:12.637425 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132\": container with ID starting with 3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132 not found: ID does 
not exist" containerID="3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.637454 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132"} err="failed to get container status \"3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132\": rpc error: code = NotFound desc = could not find container \"3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132\": container with ID starting with 3a4118f4ba39471ded5a8af7f472ada64e1865468746c5ee22880e436378f132 not found: ID does not exist" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.639084 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.667775 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.678490 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-catalog-content\") pod \"a5f262c7-0697-441b-ab95-f543fdfbde54\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.678614 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-utilities\") pod \"a5f262c7-0697-441b-ab95-f543fdfbde54\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.678713 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67mc\" (UniqueName: \"kubernetes.io/projected/a5f262c7-0697-441b-ab95-f543fdfbde54-kube-api-access-f67mc\") pod \"a5f262c7-0697-441b-ab95-f543fdfbde54\" (UID: \"a5f262c7-0697-441b-ab95-f543fdfbde54\") " Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.679289 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.679313 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.679328 4959 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.679341 4959 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7wz5\" (UniqueName: \"kubernetes.io/projected/e5e1f006-c462-46be-b191-aab45dd3d1b7-kube-api-access-z7wz5\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.679351 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5e1f006-c462-46be-b191-aab45dd3d1b7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.679363 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.680547 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-utilities" (OuterVolumeSpecName: "utilities") pod "a5f262c7-0697-441b-ab95-f543fdfbde54" (UID: "a5f262c7-0697-441b-ab95-f543fdfbde54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.681969 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.682431 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f262c7-0697-441b-ab95-f543fdfbde54-kube-api-access-f67mc" (OuterVolumeSpecName: "kube-api-access-f67mc") pod "a5f262c7-0697-441b-ab95-f543fdfbde54" (UID: "a5f262c7-0697-441b-ab95-f543fdfbde54"). InnerVolumeSpecName "kube-api-access-f67mc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.721026 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5f262c7-0697-441b-ab95-f543fdfbde54" (UID: "a5f262c7-0697-441b-ab95-f543fdfbde54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.740024 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fw59x"] Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.749620 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-config-data" (OuterVolumeSpecName: "config-data") pod "e5e1f006-c462-46be-b191-aab45dd3d1b7" (UID: "e5e1f006-c462-46be-b191-aab45dd3d1b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.750149 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fw59x"] Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.780871 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.780925 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e1f006-c462-46be-b191-aab45dd3d1b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.780939 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f262c7-0697-441b-ab95-f543fdfbde54-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.780979 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67mc\" (UniqueName: \"kubernetes.io/projected/a5f262c7-0697-441b-ab95-f543fdfbde54-kube-api-access-f67mc\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:12 crc kubenswrapper[4959]: I1007 13:52:12.819192 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" path="/var/lib/kubelet/pods/2a0efd29-49c2-420a-ac68-b479aa8a76d5/volumes" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.383949 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"640ef79d-5203-4f5e-8119-2f1eecb02bf1","Type":"ContainerStarted","Data":"3152b2ddddb408a847818a7b1ecee155d135e5b860193779db29f38fd013f5c1"} Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.384313 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"640ef79d-5203-4f5e-8119-2f1eecb02bf1","Type":"ContainerStarted","Data":"f02978386a088bb5b339ccba4a44cba80c18564a6ed1324a12ec0e9d30a2df8c"} Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.387005 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5e1f006-c462-46be-b191-aab45dd3d1b7","Type":"ContainerDied","Data":"0b7da3a18f1781d138dcb60e6b86254b90d558a04debb085d77979fa5bfe162f"} Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.387076 4959 scope.go:117] "RemoveContainer" containerID="c65d12018f1beac17f92afa2e2e6418f6f617507dfac1720c2f32dfa96d80c95" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.387142 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.393874 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8v4ps" event={"ID":"a5f262c7-0697-441b-ab95-f543fdfbde54","Type":"ContainerDied","Data":"1a76b3e5faa62009ee6cf5f10193f516b69cd74b338a44c7399dafe40df69ba9"} Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.393989 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8v4ps" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.398763 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"346eefdf-f870-4709-8f2e-6c26492cb22a","Type":"ContainerStarted","Data":"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735"} Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.424305 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.435285 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.446539 4959 scope.go:117] "RemoveContainer" containerID="289627a5d6d5996a028bb6befef0e7a2931e7337f456bf41ae79cde2cc8c3c13" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.447103 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8v4ps"] Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.460348 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.460866 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="extract-content" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.460885 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="extract-content" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.460900 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="registry-server" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.460909 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="registry-server" Oct 07 13:52:13 crc 
kubenswrapper[4959]: E1007 13:52:13.460937 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="sg-core" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.460948 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="sg-core" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.460964 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="extract-content" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.460972 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="extract-content" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.460986 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-notification-agent" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.460994 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-notification-agent" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.461010 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="registry-server" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461017 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="registry-server" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.461036 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="proxy-httpd" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461042 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="proxy-httpd" Oct 07 13:52:13 crc 
kubenswrapper[4959]: E1007 13:52:13.461059 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="extract-utilities" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461066 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="extract-utilities" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.461077 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="extract-utilities" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461084 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="extract-utilities" Oct 07 13:52:13 crc kubenswrapper[4959]: E1007 13:52:13.461098 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-central-agent" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461113 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-central-agent" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461331 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-central-agent" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461346 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="ceilometer-notification-agent" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461361 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0efd29-49c2-420a-ac68-b479aa8a76d5" containerName="registry-server" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461374 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" 
containerName="proxy-httpd" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461390 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" containerName="registry-server" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.461413 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" containerName="sg-core" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.463408 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.465760 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.466102 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.466248 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.471515 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8v4ps"] Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.482679 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.504087 4959 scope.go:117] "RemoveContainer" containerID="04a381b3bf039da37374850b0bddb599b4be36decc14cca6e1825ff7a2aececb" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.533096 4959 scope.go:117] "RemoveContainer" containerID="051b7ddb5928aaeabb43744b419252997d6acc6334fd0f07a1cf59d37a66edc8" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.561564 4959 scope.go:117] "RemoveContainer" containerID="cf16cb76e9da9567d0c4a6e2108094f657b3cd648dbc1930b406a512206f8b99" Oct 07 13:52:13 crc kubenswrapper[4959]: 
I1007 13:52:13.596595 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596667 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596693 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596729 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2mr\" (UniqueName: \"kubernetes.io/projected/d6a19a3c-15d8-4e5e-afd0-4da62abca972-kube-api-access-bm2mr\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596762 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-config-data\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596786 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-log-httpd\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596838 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-scripts\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.596859 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-run-httpd\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.597002 4959 scope.go:117] "RemoveContainer" containerID="f594996b50b34e48db9f89ac3c6225498137f5c5f5293343fda155a2920912f5" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.633273 4959 scope.go:117] "RemoveContainer" containerID="6706440ea2a5628d6a781ec27f1780f480b8c7112b07b8410d894a91b3885823" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.698567 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-config-data\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.698674 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-log-httpd\") pod \"ceilometer-0\" (UID: 
\"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.698764 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-scripts\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.698796 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-run-httpd\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.698944 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.698997 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.699031 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.699067 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bm2mr\" (UniqueName: \"kubernetes.io/projected/d6a19a3c-15d8-4e5e-afd0-4da62abca972-kube-api-access-bm2mr\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.699177 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-log-httpd\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.701714 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-run-httpd\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.704528 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.705050 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-scripts\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.705096 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 
13:52:13.706021 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.711755 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-config-data\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.717292 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2mr\" (UniqueName: \"kubernetes.io/projected/d6a19a3c-15d8-4e5e-afd0-4da62abca972-kube-api-access-bm2mr\") pod \"ceilometer-0\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") " pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.793256 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.893636 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 07 13:52:13 crc kubenswrapper[4959]: I1007 13:52:13.981813 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d8975557-jwvh5" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.061954 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f5d87575-rj7fd"] Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.062166 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerName="dnsmasq-dns" containerID="cri-o://bdfdfcce1c2bf9bcbf7c1fdeafa9b7a419064d1e2802f8324574900e53af9f42" gracePeriod=10 Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.319646 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.423826 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"640ef79d-5203-4f5e-8119-2f1eecb02bf1","Type":"ContainerStarted","Data":"d073cc5058c4ac0db9e58ab9b8f4c460b29f513eef72b1713f3b414175b241dc"} Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.424756 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.429468 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"346eefdf-f870-4709-8f2e-6c26492cb22a","Type":"ContainerStarted","Data":"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9"} Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.430602 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerStarted","Data":"da9779e927a48f62471bd502260dcb7f70ad40c75ea1ea7115ab255d18fabcfa"} Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.437903 4959 generic.go:334] "Generic (PLEG): container finished" podID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerID="bdfdfcce1c2bf9bcbf7c1fdeafa9b7a419064d1e2802f8324574900e53af9f42" exitCode=0 Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.437953 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" event={"ID":"849a94c5-f31f-4068-b47c-fb1163b6afc0","Type":"ContainerDied","Data":"bdfdfcce1c2bf9bcbf7c1fdeafa9b7a419064d1e2802f8324574900e53af9f42"} Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.446111 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.446085565 podStartE2EDuration="5.446085565s" podCreationTimestamp="2025-10-07 13:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:52:14.441905627 +0000 UTC m=+3086.602628304" watchObservedRunningTime="2025-10-07 13:52:14.446085565 +0000 UTC m=+3086.606808242" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.468715 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.143728217 podStartE2EDuration="11.468701525s" podCreationTimestamp="2025-10-07 13:52:03 +0000 UTC" firstStartedPulling="2025-10-07 13:52:04.661753064 +0000 UTC m=+3076.822475741" lastFinishedPulling="2025-10-07 13:52:11.986726352 +0000 UTC m=+3084.147449049" observedRunningTime="2025-10-07 13:52:14.467024658 +0000 UTC m=+3086.627747345" watchObservedRunningTime="2025-10-07 13:52:14.468701525 +0000 UTC m=+3086.629424202" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.593009 4959 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.722755 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-nb\") pod \"849a94c5-f31f-4068-b47c-fb1163b6afc0\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.722812 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-sb\") pod \"849a94c5-f31f-4068-b47c-fb1163b6afc0\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.722879 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-config\") pod \"849a94c5-f31f-4068-b47c-fb1163b6afc0\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.722927 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvngv\" (UniqueName: \"kubernetes.io/projected/849a94c5-f31f-4068-b47c-fb1163b6afc0-kube-api-access-wvngv\") pod \"849a94c5-f31f-4068-b47c-fb1163b6afc0\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.723041 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-openstack-edpm-ipam\") pod \"849a94c5-f31f-4068-b47c-fb1163b6afc0\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.723081 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-dns-svc\") pod \"849a94c5-f31f-4068-b47c-fb1163b6afc0\" (UID: \"849a94c5-f31f-4068-b47c-fb1163b6afc0\") " Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.735852 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849a94c5-f31f-4068-b47c-fb1163b6afc0-kube-api-access-wvngv" (OuterVolumeSpecName: "kube-api-access-wvngv") pod "849a94c5-f31f-4068-b47c-fb1163b6afc0" (UID: "849a94c5-f31f-4068-b47c-fb1163b6afc0"). InnerVolumeSpecName "kube-api-access-wvngv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.831486 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvngv\" (UniqueName: \"kubernetes.io/projected/849a94c5-f31f-4068-b47c-fb1163b6afc0-kube-api-access-wvngv\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.839315 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f262c7-0697-441b-ab95-f543fdfbde54" path="/var/lib/kubelet/pods/a5f262c7-0697-441b-ab95-f543fdfbde54/volumes" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.840456 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e1f006-c462-46be-b191-aab45dd3d1b7" path="/var/lib/kubelet/pods/e5e1f006-c462-46be-b191-aab45dd3d1b7/volumes" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.848977 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "849a94c5-f31f-4068-b47c-fb1163b6afc0" (UID: "849a94c5-f31f-4068-b47c-fb1163b6afc0"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.870715 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "849a94c5-f31f-4068-b47c-fb1163b6afc0" (UID: "849a94c5-f31f-4068-b47c-fb1163b6afc0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.886138 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "849a94c5-f31f-4068-b47c-fb1163b6afc0" (UID: "849a94c5-f31f-4068-b47c-fb1163b6afc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.886321 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "849a94c5-f31f-4068-b47c-fb1163b6afc0" (UID: "849a94c5-f31f-4068-b47c-fb1163b6afc0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.901259 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-config" (OuterVolumeSpecName: "config") pod "849a94c5-f31f-4068-b47c-fb1163b6afc0" (UID: "849a94c5-f31f-4068-b47c-fb1163b6afc0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.933105 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.933133 4959 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.933143 4959 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.933152 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:14 crc kubenswrapper[4959]: I1007 13:52:14.933161 4959 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849a94c5-f31f-4068-b47c-fb1163b6afc0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.448210 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerStarted","Data":"c4f2c4a4c849494a10aad3b6d179f050247783dba0b44d6195b224b12395b5cf"} Oct 07 13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.450575 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" event={"ID":"849a94c5-f31f-4068-b47c-fb1163b6afc0","Type":"ContainerDied","Data":"5468eec3d989422174f56a4a6b96cf2daabe7eab56b4c67367ce6c4bd49d1364"} Oct 07 
13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.450589 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5d87575-rj7fd" Oct 07 13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.450655 4959 scope.go:117] "RemoveContainer" containerID="bdfdfcce1c2bf9bcbf7c1fdeafa9b7a419064d1e2802f8324574900e53af9f42" Oct 07 13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.482156 4959 scope.go:117] "RemoveContainer" containerID="a3302a832e96814f3ca61c24a777c45b99e45237ec86c4511054e9d06b5ada4c" Oct 07 13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.508983 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f5d87575-rj7fd"] Oct 07 13:52:15 crc kubenswrapper[4959]: I1007 13:52:15.519290 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f5d87575-rj7fd"] Oct 07 13:52:16 crc kubenswrapper[4959]: I1007 13:52:16.467664 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerStarted","Data":"47664d04910bb272f0b27da6fbdc57ccfa01e14ad8e34ebd4def6fd578b0296d"} Oct 07 13:52:16 crc kubenswrapper[4959]: I1007 13:52:16.809112 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:52:16 crc kubenswrapper[4959]: E1007 13:52:16.809668 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:52:16 crc kubenswrapper[4959]: I1007 13:52:16.821224 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" path="/var/lib/kubelet/pods/849a94c5-f31f-4068-b47c-fb1163b6afc0/volumes" Oct 07 13:52:17 crc kubenswrapper[4959]: I1007 13:52:17.479291 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerStarted","Data":"100c6c6889c77bb15e5f7b670047cd2fdb486f7fc6461b68be9d9fe82df231b7"} Oct 07 13:52:17 crc kubenswrapper[4959]: I1007 13:52:17.602281 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.490540 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerStarted","Data":"a43b64e9d04069ee20162041f28777d07a3f18ae8b3af3832238d178448a8d4f"} Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.490821 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-central-agent" containerID="cri-o://c4f2c4a4c849494a10aad3b6d179f050247783dba0b44d6195b224b12395b5cf" gracePeriod=30 Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.490996 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.491016 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="proxy-httpd" containerID="cri-o://a43b64e9d04069ee20162041f28777d07a3f18ae8b3af3832238d178448a8d4f" gracePeriod=30 Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.491079 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="sg-core" 
containerID="cri-o://100c6c6889c77bb15e5f7b670047cd2fdb486f7fc6461b68be9d9fe82df231b7" gracePeriod=30 Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.491121 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-notification-agent" containerID="cri-o://47664d04910bb272f0b27da6fbdc57ccfa01e14ad8e34ebd4def6fd578b0296d" gracePeriod=30 Oct 07 13:52:18 crc kubenswrapper[4959]: I1007 13:52:18.520504 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6792293630000001 podStartE2EDuration="5.520485641s" podCreationTimestamp="2025-10-07 13:52:13 +0000 UTC" firstStartedPulling="2025-10-07 13:52:14.332490701 +0000 UTC m=+3086.493213378" lastFinishedPulling="2025-10-07 13:52:18.173746979 +0000 UTC m=+3090.334469656" observedRunningTime="2025-10-07 13:52:18.519007669 +0000 UTC m=+3090.679730356" watchObservedRunningTime="2025-10-07 13:52:18.520485641 +0000 UTC m=+3090.681208318" Oct 07 13:52:19 crc kubenswrapper[4959]: I1007 13:52:19.512865 4959 generic.go:334] "Generic (PLEG): container finished" podID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerID="100c6c6889c77bb15e5f7b670047cd2fdb486f7fc6461b68be9d9fe82df231b7" exitCode=2 Oct 07 13:52:19 crc kubenswrapper[4959]: I1007 13:52:19.513215 4959 generic.go:334] "Generic (PLEG): container finished" podID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerID="47664d04910bb272f0b27da6fbdc57ccfa01e14ad8e34ebd4def6fd578b0296d" exitCode=0 Oct 07 13:52:19 crc kubenswrapper[4959]: I1007 13:52:19.513048 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerDied","Data":"100c6c6889c77bb15e5f7b670047cd2fdb486f7fc6461b68be9d9fe82df231b7"} Oct 07 13:52:19 crc kubenswrapper[4959]: I1007 13:52:19.513257 4959 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerDied","Data":"47664d04910bb272f0b27da6fbdc57ccfa01e14ad8e34ebd4def6fd578b0296d"} Oct 07 13:52:20 crc kubenswrapper[4959]: I1007 13:52:20.523408 4959 generic.go:334] "Generic (PLEG): container finished" podID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerID="c4f2c4a4c849494a10aad3b6d179f050247783dba0b44d6195b224b12395b5cf" exitCode=0 Oct 07 13:52:20 crc kubenswrapper[4959]: I1007 13:52:20.523456 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerDied","Data":"c4f2c4a4c849494a10aad3b6d179f050247783dba0b44d6195b224b12395b5cf"} Oct 07 13:52:23 crc kubenswrapper[4959]: I1007 13:52:23.904979 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.403965 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.444326 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.446151 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.498795 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.562889 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="manila-share" containerID="cri-o://ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735" gracePeriod=30 Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.562951 4959 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="probe" containerID="cri-o://75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9" gracePeriod=30 Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.563069 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="manila-scheduler" containerID="cri-o://20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f" gracePeriod=30 Oct 07 13:52:25 crc kubenswrapper[4959]: I1007 13:52:25.563096 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="probe" containerID="cri-o://5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b" gracePeriod=30 Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.547752 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576185 4959 generic.go:334] "Generic (PLEG): container finished" podID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerID="75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9" exitCode=0 Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576213 4959 generic.go:334] "Generic (PLEG): container finished" podID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerID="ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735" exitCode=1 Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576264 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"346eefdf-f870-4709-8f2e-6c26492cb22a","Type":"ContainerDied","Data":"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9"} Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576290 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"346eefdf-f870-4709-8f2e-6c26492cb22a","Type":"ContainerDied","Data":"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735"} Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576300 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"346eefdf-f870-4709-8f2e-6c26492cb22a","Type":"ContainerDied","Data":"08ef13277420b7168b6e8cedfbd4ecce267c692f35f163419d5028a738cfc263"} Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576311 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.576327 4959 scope.go:117] "RemoveContainer" containerID="75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.590342 4959 generic.go:334] "Generic (PLEG): container finished" podID="86164cd8-7757-41e8-bc84-a7503528ee47" containerID="5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b" exitCode=0 Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.590397 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"86164cd8-7757-41e8-bc84-a7503528ee47","Type":"ContainerDied","Data":"5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b"} Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.597177 4959 scope.go:117] "RemoveContainer" containerID="ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.618969 4959 scope.go:117] "RemoveContainer" containerID="75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9" Oct 07 13:52:26 crc kubenswrapper[4959]: E1007 13:52:26.619504 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9\": container with ID starting with 75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9 not found: ID does not exist" containerID="75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.619544 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9"} err="failed to get container status \"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9\": rpc error: code = NotFound desc = could 
not find container \"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9\": container with ID starting with 75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9 not found: ID does not exist" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.619571 4959 scope.go:117] "RemoveContainer" containerID="ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735" Oct 07 13:52:26 crc kubenswrapper[4959]: E1007 13:52:26.620036 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735\": container with ID starting with ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735 not found: ID does not exist" containerID="ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.620074 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735"} err="failed to get container status \"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735\": rpc error: code = NotFound desc = could not find container \"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735\": container with ID starting with ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735 not found: ID does not exist" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.620103 4959 scope.go:117] "RemoveContainer" containerID="75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.620368 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9"} err="failed to get container status \"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9\": rpc error: code = NotFound 
desc = could not find container \"75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9\": container with ID starting with 75650f74d6dcadc99b81a9778d3f166b300cd342f08dbfd4b3d733852c8f8cf9 not found: ID does not exist" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.620389 4959 scope.go:117] "RemoveContainer" containerID="ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.620679 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735"} err="failed to get container status \"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735\": rpc error: code = NotFound desc = could not find container \"ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735\": container with ID starting with ca730f307de00ac6d803cdf6bc2f9aae2c89ec1e72c364b48b6b147627c3f735 not found: ID does not exist" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.652647 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.653771 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-var-lib-manila\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.653869 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.653902 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-ceph\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.654041 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-combined-ca-bundle\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.654109 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-etc-machine-id\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.654164 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-scripts\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.654204 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp4m6\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-kube-api-access-zp4m6\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.654243 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data-custom\") pod \"346eefdf-f870-4709-8f2e-6c26492cb22a\" (UID: \"346eefdf-f870-4709-8f2e-6c26492cb22a\") " Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.654819 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.655085 4959 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.660351 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-kube-api-access-zp4m6" (OuterVolumeSpecName: "kube-api-access-zp4m6") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). 
InnerVolumeSpecName "kube-api-access-zp4m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.660607 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.660782 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-ceph" (OuterVolumeSpecName: "ceph") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.662138 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-scripts" (OuterVolumeSpecName: "scripts") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.708128 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.747616 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data" (OuterVolumeSpecName: "config-data") pod "346eefdf-f870-4709-8f2e-6c26492cb22a" (UID: "346eefdf-f870-4709-8f2e-6c26492cb22a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757018 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757049 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757058 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757069 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757077 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp4m6\" (UniqueName: \"kubernetes.io/projected/346eefdf-f870-4709-8f2e-6c26492cb22a-kube-api-access-zp4m6\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757086 4959 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/346eefdf-f870-4709-8f2e-6c26492cb22a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.757094 4959 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/346eefdf-f870-4709-8f2e-6c26492cb22a-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.899302 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.910281 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.922251 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:26 crc kubenswrapper[4959]: E1007 13:52:26.922921 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerName="dnsmasq-dns" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923033 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerName="dnsmasq-dns" Oct 07 13:52:26 crc kubenswrapper[4959]: E1007 13:52:26.923110 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="probe" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923172 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="probe" Oct 07 13:52:26 crc kubenswrapper[4959]: E1007 13:52:26.923236 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="manila-share" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923298 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" 
containerName="manila-share" Oct 07 13:52:26 crc kubenswrapper[4959]: E1007 13:52:26.923411 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerName="init" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923478 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerName="init" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923773 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="849a94c5-f31f-4068-b47c-fb1163b6afc0" containerName="dnsmasq-dns" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923866 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="probe" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.923948 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" containerName="manila-share" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.925212 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.927156 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 07 13:52:26 crc kubenswrapper[4959]: I1007 13:52:26.930915 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.062572 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.062888 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e08ebef-1b6b-4040-8b0f-7c841e191363-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.063167 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.063209 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6bjz\" (UniqueName: \"kubernetes.io/projected/8e08ebef-1b6b-4040-8b0f-7c841e191363-kube-api-access-z6bjz\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 
07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.063347 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8e08ebef-1b6b-4040-8b0f-7c841e191363-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.063412 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-scripts\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.063436 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8e08ebef-1b6b-4040-8b0f-7c841e191363-ceph\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.063457 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-config-data\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.165690 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8e08ebef-1b6b-4040-8b0f-7c841e191363-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0" Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.166101 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-scripts\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.165795 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8e08ebef-1b6b-4040-8b0f-7c841e191363-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.166126 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8e08ebef-1b6b-4040-8b0f-7c841e191363-ceph\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.166884 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-config-data\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.166995 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.167101 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e08ebef-1b6b-4040-8b0f-7c841e191363-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.167292 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.167399 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6bjz\" (UniqueName: \"kubernetes.io/projected/8e08ebef-1b6b-4040-8b0f-7c841e191363-kube-api-access-z6bjz\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.167758 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e08ebef-1b6b-4040-8b0f-7c841e191363-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.170414 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8e08ebef-1b6b-4040-8b0f-7c841e191363-ceph\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.170471 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.171106 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.172029 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-config-data\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.172781 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e08ebef-1b6b-4040-8b0f-7c841e191363-scripts\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.184993 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6bjz\" (UniqueName: \"kubernetes.io/projected/8e08ebef-1b6b-4040-8b0f-7c841e191363-kube-api-access-z6bjz\") pod \"manila-share-share1-0\" (UID: \"8e08ebef-1b6b-4040-8b0f-7c841e191363\") " pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.247767 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Oct 07 13:52:27 crc kubenswrapper[4959]: I1007 13:52:27.959253 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Oct 07 13:52:28 crc kubenswrapper[4959]: I1007 13:52:28.622819 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8e08ebef-1b6b-4040-8b0f-7c841e191363","Type":"ContainerStarted","Data":"390f1cb7c637df2fdf20dc326a3777051231bfdb66f442b95d2d4b12ed89402b"}
Oct 07 13:52:28 crc kubenswrapper[4959]: I1007 13:52:28.623157 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8e08ebef-1b6b-4040-8b0f-7c841e191363","Type":"ContainerStarted","Data":"d657d3983fdc0dff030e2b5e5a486feb56749e20d5dbbb561b9a2bc9caf66880"}
Oct 07 13:52:28 crc kubenswrapper[4959]: I1007 13:52:28.818412 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:52:28 crc kubenswrapper[4959]: E1007 13:52:28.819560 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:52:28 crc kubenswrapper[4959]: I1007 13:52:28.843091 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346eefdf-f870-4709-8f2e-6c26492cb22a" path="/var/lib/kubelet/pods/346eefdf-f870-4709-8f2e-6c26492cb22a/volumes"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.470430 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.624464 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data\") pod \"86164cd8-7757-41e8-bc84-a7503528ee47\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") "
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.624532 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86164cd8-7757-41e8-bc84-a7503528ee47-etc-machine-id\") pod \"86164cd8-7757-41e8-bc84-a7503528ee47\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") "
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.624671 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qprxx\" (UniqueName: \"kubernetes.io/projected/86164cd8-7757-41e8-bc84-a7503528ee47-kube-api-access-qprxx\") pod \"86164cd8-7757-41e8-bc84-a7503528ee47\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") "
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.624696 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data-custom\") pod \"86164cd8-7757-41e8-bc84-a7503528ee47\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") "
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.624792 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-scripts\") pod \"86164cd8-7757-41e8-bc84-a7503528ee47\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") "
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.624836 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-combined-ca-bundle\") pod \"86164cd8-7757-41e8-bc84-a7503528ee47\" (UID: \"86164cd8-7757-41e8-bc84-a7503528ee47\") "
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.625765 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86164cd8-7757-41e8-bc84-a7503528ee47-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "86164cd8-7757-41e8-bc84-a7503528ee47" (UID: "86164cd8-7757-41e8-bc84-a7503528ee47"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.626698 4959 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86164cd8-7757-41e8-bc84-a7503528ee47-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.636128 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86164cd8-7757-41e8-bc84-a7503528ee47-kube-api-access-qprxx" (OuterVolumeSpecName: "kube-api-access-qprxx") pod "86164cd8-7757-41e8-bc84-a7503528ee47" (UID: "86164cd8-7757-41e8-bc84-a7503528ee47"). InnerVolumeSpecName "kube-api-access-qprxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.636247 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-scripts" (OuterVolumeSpecName: "scripts") pod "86164cd8-7757-41e8-bc84-a7503528ee47" (UID: "86164cd8-7757-41e8-bc84-a7503528ee47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.639120 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8e08ebef-1b6b-4040-8b0f-7c841e191363","Type":"ContainerStarted","Data":"fb468560cdc08be2833818c518a9c73908592f6f19b83819b5829f13412823a6"}
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.641879 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86164cd8-7757-41e8-bc84-a7503528ee47" (UID: "86164cd8-7757-41e8-bc84-a7503528ee47"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.643441 4959 generic.go:334] "Generic (PLEG): container finished" podID="86164cd8-7757-41e8-bc84-a7503528ee47" containerID="20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f" exitCode=0
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.643507 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.643553 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"86164cd8-7757-41e8-bc84-a7503528ee47","Type":"ContainerDied","Data":"20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f"}
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.643639 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"86164cd8-7757-41e8-bc84-a7503528ee47","Type":"ContainerDied","Data":"1f941d31dc02618ff4e9a1d97b19e85724f1dfc4344fdbfbba7aee25b70f59f1"}
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.643671 4959 scope.go:117] "RemoveContainer" containerID="5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.680347 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86164cd8-7757-41e8-bc84-a7503528ee47" (UID: "86164cd8-7757-41e8-bc84-a7503528ee47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.712266 4959 scope.go:117] "RemoveContainer" containerID="20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.728243 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qprxx\" (UniqueName: \"kubernetes.io/projected/86164cd8-7757-41e8-bc84-a7503528ee47-kube-api-access-qprxx\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.728277 4959 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.728287 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.728298 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.736502 4959 scope.go:117] "RemoveContainer" containerID="5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b"
Oct 07 13:52:29 crc kubenswrapper[4959]: E1007 13:52:29.737939 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b\": container with ID starting with 5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b not found: ID does not exist" containerID="5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.737983 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b"} err="failed to get container status \"5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b\": rpc error: code = NotFound desc = could not find container \"5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b\": container with ID starting with 5753cb2736b53c81755c98821e664c75c65e71cb18f7ec1b1e2c03596b4ad30b not found: ID does not exist"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.738011 4959 scope.go:117] "RemoveContainer" containerID="20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f"
Oct 07 13:52:29 crc kubenswrapper[4959]: E1007 13:52:29.738467 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f\": container with ID starting with 20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f not found: ID does not exist" containerID="20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.738504 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f"} err="failed to get container status \"20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f\": rpc error: code = NotFound desc = could not find container \"20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f\": container with ID starting with 20e9d4df5750bc90fd848503d17b8c31859880053321f05b40a52c82e1c56c2f not found: ID does not exist"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.751097 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data" (OuterVolumeSpecName: "config-data") pod "86164cd8-7757-41e8-bc84-a7503528ee47" (UID: "86164cd8-7757-41e8-bc84-a7503528ee47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.831037 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86164cd8-7757-41e8-bc84-a7503528ee47-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.971857 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.971827385 podStartE2EDuration="3.971827385s" podCreationTimestamp="2025-10-07 13:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:52:29.670679663 +0000 UTC m=+3101.831403420" watchObservedRunningTime="2025-10-07 13:52:29.971827385 +0000 UTC m=+3102.132550062"
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.982180 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 07 13:52:29 crc kubenswrapper[4959]: I1007 13:52:29.990615 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.016198 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Oct 07 13:52:30 crc kubenswrapper[4959]: E1007 13:52:30.017245 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="manila-scheduler"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.017273 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="manila-scheduler"
Oct 07 13:52:30 crc kubenswrapper[4959]: E1007 13:52:30.017307 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="probe"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.017315 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="probe"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.017559 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="manila-scheduler"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.017596 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" containerName="probe"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.018665 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.023114 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.058721 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.142259 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.142420 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45ac094a-4d13-4664-94e2-149bdb7b4548-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.142457 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-scripts\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.142679 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-config-data\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.142836 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5wtk\" (UniqueName: \"kubernetes.io/projected/45ac094a-4d13-4664-94e2-149bdb7b4548-kube-api-access-f5wtk\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.142989 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.245761 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-scripts\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.245844 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-config-data\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.245885 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5wtk\" (UniqueName: \"kubernetes.io/projected/45ac094a-4d13-4664-94e2-149bdb7b4548-kube-api-access-f5wtk\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.245929 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.246002 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.246069 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45ac094a-4d13-4664-94e2-149bdb7b4548-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.246177 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45ac094a-4d13-4664-94e2-149bdb7b4548-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.251153 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.251439 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-scripts\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.252460 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-config-data\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.253146 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ac094a-4d13-4664-94e2-149bdb7b4548-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.264731 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5wtk\" (UniqueName: \"kubernetes.io/projected/45ac094a-4d13-4664-94e2-149bdb7b4548-kube-api-access-f5wtk\") pod \"manila-scheduler-0\" (UID: \"45ac094a-4d13-4664-94e2-149bdb7b4548\") " pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.364567 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.819454 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86164cd8-7757-41e8-bc84-a7503528ee47" path="/var/lib/kubelet/pods/86164cd8-7757-41e8-bc84-a7503528ee47/volumes"
Oct 07 13:52:30 crc kubenswrapper[4959]: I1007 13:52:30.845390 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 07 13:52:30 crc kubenswrapper[4959]: W1007 13:52:30.849065 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ac094a_4d13_4664_94e2_149bdb7b4548.slice/crio-c480252f4935da10dced7e80e19e11aa85dcc4bd6b8a606144a48508051d4ac8 WatchSource:0}: Error finding container c480252f4935da10dced7e80e19e11aa85dcc4bd6b8a606144a48508051d4ac8: Status 404 returned error can't find the container with id c480252f4935da10dced7e80e19e11aa85dcc4bd6b8a606144a48508051d4ac8
Oct 07 13:52:31 crc kubenswrapper[4959]: I1007 13:52:31.145318 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Oct 07 13:52:31 crc kubenswrapper[4959]: I1007 13:52:31.681950 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"45ac094a-4d13-4664-94e2-149bdb7b4548","Type":"ContainerStarted","Data":"87e0b9113b0cb68756dcb9eaba3ad5c444965488f9a76bffe1eb95ca9882f710"}
Oct 07 13:52:31 crc kubenswrapper[4959]: I1007 13:52:31.682338 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"45ac094a-4d13-4664-94e2-149bdb7b4548","Type":"ContainerStarted","Data":"c480252f4935da10dced7e80e19e11aa85dcc4bd6b8a606144a48508051d4ac8"}
Oct 07 13:52:32 crc kubenswrapper[4959]: I1007 13:52:32.692344 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"45ac094a-4d13-4664-94e2-149bdb7b4548","Type":"ContainerStarted","Data":"fa0f347061c9b31aadf23b50d8455c19e9a81d4d4fd22a583dc44e21be9e1337"}
Oct 07 13:52:32 crc kubenswrapper[4959]: I1007 13:52:32.710700 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.710678036 podStartE2EDuration="3.710678036s" podCreationTimestamp="2025-10-07 13:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:52:32.708908526 +0000 UTC m=+3104.869631213" watchObservedRunningTime="2025-10-07 13:52:32.710678036 +0000 UTC m=+3104.871400723"
Oct 07 13:52:37 crc kubenswrapper[4959]: I1007 13:52:37.248547 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Oct 07 13:52:40 crc kubenswrapper[4959]: I1007 13:52:40.365932 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Oct 07 13:52:43 crc kubenswrapper[4959]: I1007 13:52:43.801032 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Oct 07 13:52:44 crc kubenswrapper[4959]: I1007 13:52:44.808681 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:52:44 crc kubenswrapper[4959]: E1007 13:52:44.808993 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:52:48 crc kubenswrapper[4959]: I1007 13:52:48.761441 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Oct 07 13:52:48 crc kubenswrapper[4959]: I1007 13:52:48.874502 4959 generic.go:334] "Generic (PLEG): container finished" podID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerID="a43b64e9d04069ee20162041f28777d07a3f18ae8b3af3832238d178448a8d4f" exitCode=137
Oct 07 13:52:48 crc kubenswrapper[4959]: I1007 13:52:48.874565 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerDied","Data":"a43b64e9d04069ee20162041f28777d07a3f18ae8b3af3832238d178448a8d4f"}
Oct 07 13:52:48 crc kubenswrapper[4959]: I1007 13:52:48.874606 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6a19a3c-15d8-4e5e-afd0-4da62abca972","Type":"ContainerDied","Data":"da9779e927a48f62471bd502260dcb7f70ad40c75ea1ea7115ab255d18fabcfa"}
Oct 07 13:52:48 crc kubenswrapper[4959]: I1007 13:52:48.874619 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da9779e927a48f62471bd502260dcb7f70ad40c75ea1ea7115ab255d18fabcfa"
Oct 07 13:52:48 crc kubenswrapper[4959]: I1007 13:52:48.929876 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.039680 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-sg-core-conf-yaml\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.039722 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2mr\" (UniqueName: \"kubernetes.io/projected/d6a19a3c-15d8-4e5e-afd0-4da62abca972-kube-api-access-bm2mr\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.039769 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-log-httpd\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.039983 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-run-httpd\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040007 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-scripts\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040025 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-combined-ca-bundle\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040053 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-ceilometer-tls-certs\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040079 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-config-data\") pod \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\" (UID: \"d6a19a3c-15d8-4e5e-afd0-4da62abca972\") "
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040360 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040747 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.040919 4959 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.045939 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a19a3c-15d8-4e5e-afd0-4da62abca972-kube-api-access-bm2mr" (OuterVolumeSpecName: "kube-api-access-bm2mr") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "kube-api-access-bm2mr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.046778 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-scripts" (OuterVolumeSpecName: "scripts") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.067999 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.104957 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.140011 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.142573 4959 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6a19a3c-15d8-4e5e-afd0-4da62abca972-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.142601 4959 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.142611 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.142633 4959 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.142642 4959 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.142653 4959 reconciler_common.go:293] "Volume detached for volume
\"kube-api-access-bm2mr\" (UniqueName: \"kubernetes.io/projected/d6a19a3c-15d8-4e5e-afd0-4da62abca972-kube-api-access-bm2mr\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.151669 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-config-data" (OuterVolumeSpecName: "config-data") pod "d6a19a3c-15d8-4e5e-afd0-4da62abca972" (UID: "d6a19a3c-15d8-4e5e-afd0-4da62abca972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.244538 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a19a3c-15d8-4e5e-afd0-4da62abca972-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.883161 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.916802 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.928708 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.948688 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:49 crc kubenswrapper[4959]: E1007 13:52:49.949121 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="proxy-httpd" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949133 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="proxy-httpd" Oct 07 13:52:49 crc kubenswrapper[4959]: E1007 13:52:49.949155 4959 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-central-agent" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949162 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-central-agent" Oct 07 13:52:49 crc kubenswrapper[4959]: E1007 13:52:49.949176 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="sg-core" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949183 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="sg-core" Oct 07 13:52:49 crc kubenswrapper[4959]: E1007 13:52:49.949213 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-notification-agent" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949219 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-notification-agent" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949394 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-central-agent" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949413 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="ceilometer-notification-agent" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949424 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="proxy-httpd" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.949436 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" containerName="sg-core" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.951088 4959 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.954211 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.954532 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.954671 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 13:52:49 crc kubenswrapper[4959]: I1007 13:52:49.960452 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061135 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b0fa6d2-0f70-48bf-ba53-542df646b703-run-httpd\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061219 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b0fa6d2-0f70-48bf-ba53-542df646b703-log-httpd\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061265 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgzc\" (UniqueName: \"kubernetes.io/projected/3b0fa6d2-0f70-48bf-ba53-542df646b703-kube-api-access-6tgzc\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061323 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-config-data\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061354 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061411 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061474 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.061500 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-scripts\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.163687 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3b0fa6d2-0f70-48bf-ba53-542df646b703-log-httpd\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.163733 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgzc\" (UniqueName: \"kubernetes.io/projected/3b0fa6d2-0f70-48bf-ba53-542df646b703-kube-api-access-6tgzc\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.163758 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-config-data\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.163825 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.163862 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.163928 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 
crc kubenswrapper[4959]: I1007 13:52:50.163951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-scripts\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.164040 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b0fa6d2-0f70-48bf-ba53-542df646b703-run-httpd\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.164492 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b0fa6d2-0f70-48bf-ba53-542df646b703-log-httpd\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.164598 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b0fa6d2-0f70-48bf-ba53-542df646b703-run-httpd\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.168791 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.170045 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.170535 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.178969 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-scripts\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.184821 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b0fa6d2-0f70-48bf-ba53-542df646b703-config-data\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.187890 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tgzc\" (UniqueName: \"kubernetes.io/projected/3b0fa6d2-0f70-48bf-ba53-542df646b703-kube-api-access-6tgzc\") pod \"ceilometer-0\" (UID: \"3b0fa6d2-0f70-48bf-ba53-542df646b703\") " pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.281597 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.728408 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.820136 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a19a3c-15d8-4e5e-afd0-4da62abca972" path="/var/lib/kubelet/pods/d6a19a3c-15d8-4e5e-afd0-4da62abca972/volumes" Oct 07 13:52:50 crc kubenswrapper[4959]: I1007 13:52:50.894368 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b0fa6d2-0f70-48bf-ba53-542df646b703","Type":"ContainerStarted","Data":"93d23c2b6ff22af990917bc1992eb619e78e2ee04c08795c8c42cc5b01ba7dfd"} Oct 07 13:52:52 crc kubenswrapper[4959]: I1007 13:52:52.035440 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 07 13:52:52 crc kubenswrapper[4959]: I1007 13:52:52.934262 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b0fa6d2-0f70-48bf-ba53-542df646b703","Type":"ContainerStarted","Data":"d0858a8b3591daee1face244293f5e81a3eb499c33ca2f6a72ce2edb3a332cbc"} Oct 07 13:52:53 crc kubenswrapper[4959]: I1007 13:52:53.953121 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b0fa6d2-0f70-48bf-ba53-542df646b703","Type":"ContainerStarted","Data":"dce548b22ced0577e9294f559be9e6bf9205147c9113a0e701e0c87ff42a7b8c"} Oct 07 13:52:54 crc kubenswrapper[4959]: I1007 13:52:54.972346 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b0fa6d2-0f70-48bf-ba53-542df646b703","Type":"ContainerStarted","Data":"a11388a0f6abe5fcdc7905ccc3f6a8e02506a6840eeab3888d80549719655372"} Oct 07 13:52:56 crc kubenswrapper[4959]: I1007 13:52:56.997770 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3b0fa6d2-0f70-48bf-ba53-542df646b703","Type":"ContainerStarted","Data":"6eb8a9694c9c1b8c0f38c7331a2596ec8885f359e2229e540c9b5e5edcc1b7d5"} Oct 07 13:52:56 crc kubenswrapper[4959]: I1007 13:52:56.998353 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 13:52:57 crc kubenswrapper[4959]: I1007 13:52:57.018085 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.993852844 podStartE2EDuration="8.018064237s" podCreationTimestamp="2025-10-07 13:52:49 +0000 UTC" firstStartedPulling="2025-10-07 13:52:50.737878714 +0000 UTC m=+3122.898601411" lastFinishedPulling="2025-10-07 13:52:55.762090127 +0000 UTC m=+3127.922812804" observedRunningTime="2025-10-07 13:52:57.015173805 +0000 UTC m=+3129.175896502" watchObservedRunningTime="2025-10-07 13:52:57.018064237 +0000 UTC m=+3129.178786914" Oct 07 13:52:57 crc kubenswrapper[4959]: I1007 13:52:57.808732 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:52:57 crc kubenswrapper[4959]: E1007 13:52:57.809164 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:53:11 crc kubenswrapper[4959]: I1007 13:53:11.809384 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:53:11 crc kubenswrapper[4959]: E1007 13:53:11.810522 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:53:20 crc kubenswrapper[4959]: I1007 13:53:20.289220 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 13:53:22 crc kubenswrapper[4959]: I1007 13:53:22.808554 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:53:22 crc kubenswrapper[4959]: E1007 13:53:22.809103 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:53:35 crc kubenswrapper[4959]: I1007 13:53:35.809143 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:53:35 crc kubenswrapper[4959]: E1007 13:53:35.810095 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:53:48 crc kubenswrapper[4959]: I1007 13:53:48.825394 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:53:48 crc 
kubenswrapper[4959]: E1007 13:53:48.826712 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:54:00 crc kubenswrapper[4959]: I1007 13:54:00.808718 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:54:00 crc kubenswrapper[4959]: E1007 13:54:00.809505 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:54:14 crc kubenswrapper[4959]: I1007 13:54:14.808911 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 13:54:14 crc kubenswrapper[4959]: E1007 13:54:14.809773 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.265322 4959 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr"] Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.267299 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.299411 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr"] Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.405085 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9tpm\" (UniqueName: \"kubernetes.io/projected/88b5404c-1e9b-42c9-9c21-fb32b136db86-kube-api-access-z9tpm\") pod \"openstack-operator-controller-operator-86c7c896d7-mzwlr\" (UID: \"88b5404c-1e9b-42c9-9c21-fb32b136db86\") " pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.507570 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9tpm\" (UniqueName: \"kubernetes.io/projected/88b5404c-1e9b-42c9-9c21-fb32b136db86-kube-api-access-z9tpm\") pod \"openstack-operator-controller-operator-86c7c896d7-mzwlr\" (UID: \"88b5404c-1e9b-42c9-9c21-fb32b136db86\") " pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.528175 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9tpm\" (UniqueName: \"kubernetes.io/projected/88b5404c-1e9b-42c9-9c21-fb32b136db86-kube-api-access-z9tpm\") pod \"openstack-operator-controller-operator-86c7c896d7-mzwlr\" (UID: \"88b5404c-1e9b-42c9-9c21-fb32b136db86\") " pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" Oct 07 13:54:18 crc kubenswrapper[4959]: I1007 13:54:18.588503 4959 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" Oct 07 13:54:19 crc kubenswrapper[4959]: I1007 13:54:19.070211 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr"] Oct 07 13:54:19 crc kubenswrapper[4959]: I1007 13:54:19.807539 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" event={"ID":"88b5404c-1e9b-42c9-9c21-fb32b136db86","Type":"ContainerStarted","Data":"8170c7c1498b06ffefcbce8f19c93d9587f3e1e8ff840cf23dea0a884dbd0a15"} Oct 07 13:54:19 crc kubenswrapper[4959]: I1007 13:54:19.807871 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" event={"ID":"88b5404c-1e9b-42c9-9c21-fb32b136db86","Type":"ContainerStarted","Data":"0e00fefa35b5b81ede1a269b7f470c1b7cbc495852ddd06c0917af105715db16"} Oct 07 13:54:19 crc kubenswrapper[4959]: I1007 13:54:19.807882 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" event={"ID":"88b5404c-1e9b-42c9-9c21-fb32b136db86","Type":"ContainerStarted","Data":"eba97f2b45ee0555d5b7836cd71c43367af05e0108dcb7c0d24e560c1710b928"} Oct 07 13:54:19 crc kubenswrapper[4959]: I1007 13:54:19.807905 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" Oct 07 13:54:19 crc kubenswrapper[4959]: I1007 13:54:19.845347 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr" podStartSLOduration=1.845327681 podStartE2EDuration="1.845327681s" podCreationTimestamp="2025-10-07 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:54:19.835920655 +0000 UTC m=+3211.996643342" watchObservedRunningTime="2025-10-07 13:54:19.845327681 +0000 UTC m=+3212.006050348"
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.591833 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-86c7c896d7-mzwlr"
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.676136 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"]
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.676367 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="operator" containerID="cri-o://e1923992d91364cb7b808758e020bed32bb1549bfd5c69e361b20a3941d70152" gracePeriod=10
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.676481 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="kube-rbac-proxy" containerID="cri-o://9bddedf47315e60b3586b3fc3e19d26c00edca23523d4c8be1280f3967cf4ffd" gracePeriod=10
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.886440 4959 generic.go:334] "Generic (PLEG): container finished" podID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerID="9bddedf47315e60b3586b3fc3e19d26c00edca23523d4c8be1280f3967cf4ffd" exitCode=0
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.886905 4959 generic.go:334] "Generic (PLEG): container finished" podID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerID="e1923992d91364cb7b808758e020bed32bb1549bfd5c69e361b20a3941d70152" exitCode=0
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.886932 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" event={"ID":"2273db8c-b41b-453c-a22d-5fbb57fd2178","Type":"ContainerDied","Data":"9bddedf47315e60b3586b3fc3e19d26c00edca23523d4c8be1280f3967cf4ffd"}
Oct 07 13:54:28 crc kubenswrapper[4959]: I1007 13:54:28.886965 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" event={"ID":"2273db8c-b41b-453c-a22d-5fbb57fd2178","Type":"ContainerDied","Data":"e1923992d91364cb7b808758e020bed32bb1549bfd5c69e361b20a3941d70152"}
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.117185 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.211263 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5fvp\" (UniqueName: \"kubernetes.io/projected/2273db8c-b41b-453c-a22d-5fbb57fd2178-kube-api-access-r5fvp\") pod \"2273db8c-b41b-453c-a22d-5fbb57fd2178\" (UID: \"2273db8c-b41b-453c-a22d-5fbb57fd2178\") "
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.216278 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2273db8c-b41b-453c-a22d-5fbb57fd2178-kube-api-access-r5fvp" (OuterVolumeSpecName: "kube-api-access-r5fvp") pod "2273db8c-b41b-453c-a22d-5fbb57fd2178" (UID: "2273db8c-b41b-453c-a22d-5fbb57fd2178"). InnerVolumeSpecName "kube-api-access-r5fvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.313452 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5fvp\" (UniqueName: \"kubernetes.io/projected/2273db8c-b41b-453c-a22d-5fbb57fd2178-kube-api-access-r5fvp\") on node \"crc\" DevicePath \"\""
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.809407 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:54:29 crc kubenswrapper[4959]: E1007 13:54:29.809996 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.897323 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" event={"ID":"2273db8c-b41b-453c-a22d-5fbb57fd2178","Type":"ContainerDied","Data":"436b9c634db4548066c50d42e9d7d38b42caabe0dd49c03763d31f5f19b726ac"}
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.897553 4959 scope.go:117] "RemoveContainer" containerID="9bddedf47315e60b3586b3fc3e19d26c00edca23523d4c8be1280f3967cf4ffd"
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.897362 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.921715 4959 scope.go:117] "RemoveContainer" containerID="e1923992d91364cb7b808758e020bed32bb1549bfd5c69e361b20a3941d70152"
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.933402 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"]
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.942290 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq"]
Oct 07 13:54:29 crc kubenswrapper[4959]: I1007 13:54:29.991646 4959 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-57bc4467bb-mb7dq" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.68:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 07 13:54:30 crc kubenswrapper[4959]: I1007 13:54:30.856120 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" path="/var/lib/kubelet/pods/2273db8c-b41b-453c-a22d-5fbb57fd2178/volumes"
Oct 07 13:54:44 crc kubenswrapper[4959]: I1007 13:54:44.808810 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:54:44 crc kubenswrapper[4959]: E1007 13:54:44.809782 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:54:59 crc kubenswrapper[4959]: I1007 13:54:59.809385 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:54:59 crc kubenswrapper[4959]: E1007 13:54:59.810454 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.642952 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"]
Oct 07 13:55:07 crc kubenswrapper[4959]: E1007 13:55:07.644145 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="kube-rbac-proxy"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.644162 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="kube-rbac-proxy"
Oct 07 13:55:07 crc kubenswrapper[4959]: E1007 13:55:07.644196 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="operator"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.644203 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="operator"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.644439 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="operator"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.644471 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2273db8c-b41b-453c-a22d-5fbb57fd2178" containerName="kube-rbac-proxy"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.645702 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.650410 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"]
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.722788 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2sd\" (UniqueName: \"kubernetes.io/projected/6108c0b3-e7a9-412c-9085-0eea09f342c6-kube-api-access-cz2sd\") pod \"test-operator-controller-manager-55c6894594-6pn9s\" (UID: \"6108c0b3-e7a9-412c-9085-0eea09f342c6\") " pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.826058 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2sd\" (UniqueName: \"kubernetes.io/projected/6108c0b3-e7a9-412c-9085-0eea09f342c6-kube-api-access-cz2sd\") pod \"test-operator-controller-manager-55c6894594-6pn9s\" (UID: \"6108c0b3-e7a9-412c-9085-0eea09f342c6\") " pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.850452 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2sd\" (UniqueName: \"kubernetes.io/projected/6108c0b3-e7a9-412c-9085-0eea09f342c6-kube-api-access-cz2sd\") pod \"test-operator-controller-manager-55c6894594-6pn9s\" (UID: \"6108c0b3-e7a9-412c-9085-0eea09f342c6\") " pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:07 crc kubenswrapper[4959]: I1007 13:55:07.978429 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:08 crc kubenswrapper[4959]: I1007 13:55:08.403430 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"]
Oct 07 13:55:09 crc kubenswrapper[4959]: I1007 13:55:09.255267 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s" event={"ID":"6108c0b3-e7a9-412c-9085-0eea09f342c6","Type":"ContainerStarted","Data":"ec6f3318d6a8449025612611e8d3d9a4fdc1fa701b3df292c45090307b240ea1"}
Oct 07 13:55:10 crc kubenswrapper[4959]: I1007 13:55:10.266962 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s" event={"ID":"6108c0b3-e7a9-412c-9085-0eea09f342c6","Type":"ContainerStarted","Data":"ce682a018a3edc0db4c13eb6572f4fdf496f4c06194dce39c8227cab4b13c22a"}
Oct 07 13:55:10 crc kubenswrapper[4959]: I1007 13:55:10.267483 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:10 crc kubenswrapper[4959]: I1007 13:55:10.267502 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s" event={"ID":"6108c0b3-e7a9-412c-9085-0eea09f342c6","Type":"ContainerStarted","Data":"9244887ea05da73a20839956fcacdeb3a963de96da2307bf02fc5258acbd45bf"}
Oct 07 13:55:10 crc kubenswrapper[4959]: I1007 13:55:10.290939 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s" podStartSLOduration=2.351222504 podStartE2EDuration="3.290919924s" podCreationTimestamp="2025-10-07 13:55:07 +0000 UTC" firstStartedPulling="2025-10-07 13:55:08.4213475 +0000 UTC m=+3260.582070177" lastFinishedPulling="2025-10-07 13:55:09.36104488 +0000 UTC m=+3261.521767597" observedRunningTime="2025-10-07 13:55:10.28232256 +0000 UTC m=+3262.443045247" watchObservedRunningTime="2025-10-07 13:55:10.290919924 +0000 UTC m=+3262.451642601"
Oct 07 13:55:10 crc kubenswrapper[4959]: I1007 13:55:10.809279 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:55:10 crc kubenswrapper[4959]: E1007 13:55:10.810194 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:55:17 crc kubenswrapper[4959]: I1007 13:55:17.983204 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55c6894594-6pn9s"
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.069293 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"]
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.069533 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="kube-rbac-proxy" containerID="cri-o://6a154f9a5e1a23a1a6ce823732131b4e151887b6ee787e1271339e73aaedcfb8" gracePeriod=10
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.069739 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="manager" containerID="cri-o://d3a5b7268c2a2307dc1738fa93a8f05f23fe47f3e013504c9957be3f0c64e9c0" gracePeriod=10
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.336091 4959 generic.go:334] "Generic (PLEG): container finished" podID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerID="d3a5b7268c2a2307dc1738fa93a8f05f23fe47f3e013504c9957be3f0c64e9c0" exitCode=0
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.336547 4959 generic.go:334] "Generic (PLEG): container finished" podID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerID="6a154f9a5e1a23a1a6ce823732131b4e151887b6ee787e1271339e73aaedcfb8" exitCode=0
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.336194 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" event={"ID":"4d7dd390-0ad9-42df-9a4f-c8804639fa3f","Type":"ContainerDied","Data":"d3a5b7268c2a2307dc1738fa93a8f05f23fe47f3e013504c9957be3f0c64e9c0"}
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.336603 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" event={"ID":"4d7dd390-0ad9-42df-9a4f-c8804639fa3f","Type":"ContainerDied","Data":"6a154f9a5e1a23a1a6ce823732131b4e151887b6ee787e1271339e73aaedcfb8"}
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.555770 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.681275 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nlg4\" (UniqueName: \"kubernetes.io/projected/4d7dd390-0ad9-42df-9a4f-c8804639fa3f-kube-api-access-6nlg4\") pod \"4d7dd390-0ad9-42df-9a4f-c8804639fa3f\" (UID: \"4d7dd390-0ad9-42df-9a4f-c8804639fa3f\") "
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.689869 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7dd390-0ad9-42df-9a4f-c8804639fa3f-kube-api-access-6nlg4" (OuterVolumeSpecName: "kube-api-access-6nlg4") pod "4d7dd390-0ad9-42df-9a4f-c8804639fa3f" (UID: "4d7dd390-0ad9-42df-9a4f-c8804639fa3f"). InnerVolumeSpecName "kube-api-access-6nlg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 13:55:18 crc kubenswrapper[4959]: I1007 13:55:18.783993 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nlg4\" (UniqueName: \"kubernetes.io/projected/4d7dd390-0ad9-42df-9a4f-c8804639fa3f-kube-api-access-6nlg4\") on node \"crc\" DevicePath \"\""
Oct 07 13:55:19 crc kubenswrapper[4959]: I1007 13:55:19.347437 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7" event={"ID":"4d7dd390-0ad9-42df-9a4f-c8804639fa3f","Type":"ContainerDied","Data":"bbeefc611dc8e62e3a2d7891d1352a207535241990ed55df9aa319494a90277b"}
Oct 07 13:55:19 crc kubenswrapper[4959]: I1007 13:55:19.347475 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"
Oct 07 13:55:19 crc kubenswrapper[4959]: I1007 13:55:19.347494 4959 scope.go:117] "RemoveContainer" containerID="d3a5b7268c2a2307dc1738fa93a8f05f23fe47f3e013504c9957be3f0c64e9c0"
Oct 07 13:55:19 crc kubenswrapper[4959]: I1007 13:55:19.371175 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"]
Oct 07 13:55:19 crc kubenswrapper[4959]: I1007 13:55:19.378328 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-wrtz7"]
Oct 07 13:55:19 crc kubenswrapper[4959]: I1007 13:55:19.385954 4959 scope.go:117] "RemoveContainer" containerID="6a154f9a5e1a23a1a6ce823732131b4e151887b6ee787e1271339e73aaedcfb8"
Oct 07 13:55:20 crc kubenswrapper[4959]: I1007 13:55:20.820903 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" path="/var/lib/kubelet/pods/4d7dd390-0ad9-42df-9a4f-c8804639fa3f/volumes"
Oct 07 13:55:24 crc kubenswrapper[4959]: I1007 13:55:24.809906 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:55:24 crc kubenswrapper[4959]: E1007 13:55:24.810773 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:55:39 crc kubenswrapper[4959]: I1007 13:55:39.809471 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:55:39 crc kubenswrapper[4959]: E1007 13:55:39.810271 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:55:52 crc kubenswrapper[4959]: I1007 13:55:52.809481 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:55:52 crc kubenswrapper[4959]: E1007 13:55:52.813317 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:56:06 crc kubenswrapper[4959]: I1007 13:56:06.809082 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:56:06 crc kubenswrapper[4959]: E1007 13:56:06.809933 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:56:19 crc kubenswrapper[4959]: I1007 13:56:19.808849 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:56:19 crc kubenswrapper[4959]: E1007 13:56:19.809989 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:56:34 crc kubenswrapper[4959]: I1007 13:56:34.808560 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:56:34 crc kubenswrapper[4959]: E1007 13:56:34.811283 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 13:56:47 crc kubenswrapper[4959]: I1007 13:56:47.809086 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe"
Oct 07 13:56:48 crc kubenswrapper[4959]: I1007 13:56:48.125903 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"2ff27817fd67efa886d22b5481e5b4f78c6042f02cfd058c0ade04e6527315c9"}
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.804487 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-full"]
Oct 07 13:57:11 crc kubenswrapper[4959]: E1007 13:57:11.805608 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="manager"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.805643 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="manager"
Oct 07 13:57:11 crc kubenswrapper[4959]: E1007 13:57:11.805660 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="kube-rbac-proxy"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.805670 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="kube-rbac-proxy"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.808152 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="kube-rbac-proxy"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.808191 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7dd390-0ad9-42df-9a4f-c8804639fa3f" containerName="manager"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.809798 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.813595 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bm2d9"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.813876 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.814279 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.814423 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.846980 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x4ngv"]
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.849457 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4ngv"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.855574 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"]
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.863908 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4ngv"]
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.890795 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.890870 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.890906 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.890930 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.890984 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.891029 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.891069 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnqkc\" (UniqueName: \"kubernetes.io/projected/b14b7636-6093-478a-945a-a512ef1935b4-kube-api-access-mnqkc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.891091 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.891108 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.891134 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.992949 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993038 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993073 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84nnc\" (UniqueName: \"kubernetes.io/projected/728a7535-30be-4373-8be2-e4b0c4d6f713-kube-api-access-84nnc\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993116 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993144 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993174 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993210 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-catalog-content\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993239 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993270 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-utilities\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993319 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnqkc\" (UniqueName: \"kubernetes.io/projected/b14b7636-6093-478a-945a-a512ef1935b4-kube-api-access-mnqkc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993348 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993370 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993395 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993577 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.993927 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.994233 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.994495 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:11 crc kubenswrapper[4959]: I1007 13:57:11.995322 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.000614 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.001461 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.002136 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.006389 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.015374 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnqkc\" (UniqueName: \"kubernetes.io/projected/b14b7636-6093-478a-945a-a512ef1935b4-kube-api-access-mnqkc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.028506 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\"
(UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.095657 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-catalog-content\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.095726 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-utilities\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.095893 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84nnc\" (UniqueName: \"kubernetes.io/projected/728a7535-30be-4373-8be2-e4b0c4d6f713-kube-api-access-84nnc\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.096191 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-catalog-content\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.096200 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-utilities\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " 
pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.115919 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84nnc\" (UniqueName: \"kubernetes.io/projected/728a7535-30be-4373-8be2-e4b0c4d6f713-kube-api-access-84nnc\") pod \"certified-operators-x4ngv\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.141640 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.179124 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.725914 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.730294 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:57:12 crc kubenswrapper[4959]: W1007 13:57:12.810492 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728a7535_30be_4373_8be2_e4b0c4d6f713.slice/crio-5eab10b7b98bd2c3b65bf7a35f024f885bd65858d48f27f15cefd6b1371b4fc7 WatchSource:0}: Error finding container 5eab10b7b98bd2c3b65bf7a35f024f885bd65858d48f27f15cefd6b1371b4fc7: Status 404 returned error can't find the container with id 5eab10b7b98bd2c3b65bf7a35f024f885bd65858d48f27f15cefd6b1371b4fc7 Oct 07 13:57:12 crc kubenswrapper[4959]: I1007 13:57:12.821735 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4ngv"] Oct 07 13:57:13 crc kubenswrapper[4959]: I1007 13:57:13.350310 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"b14b7636-6093-478a-945a-a512ef1935b4","Type":"ContainerStarted","Data":"7910e285370d0ef417a997332303da89f6eeeb5e48cc907ccdab1f4a656849f7"} Oct 07 13:57:13 crc kubenswrapper[4959]: I1007 13:57:13.352719 4959 generic.go:334] "Generic (PLEG): container finished" podID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerID="a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827" exitCode=0 Oct 07 13:57:13 crc kubenswrapper[4959]: I1007 13:57:13.352880 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerDied","Data":"a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827"} Oct 07 13:57:13 crc kubenswrapper[4959]: I1007 13:57:13.352919 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerStarted","Data":"5eab10b7b98bd2c3b65bf7a35f024f885bd65858d48f27f15cefd6b1371b4fc7"} Oct 07 13:57:14 crc kubenswrapper[4959]: I1007 13:57:14.365122 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerStarted","Data":"63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648"} Oct 07 13:57:15 crc kubenswrapper[4959]: I1007 13:57:15.376815 4959 generic.go:334] "Generic (PLEG): container finished" podID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerID="63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648" exitCode=0 Oct 07 13:57:15 crc kubenswrapper[4959]: I1007 13:57:15.377292 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" 
event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerDied","Data":"63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648"} Oct 07 13:57:45 crc kubenswrapper[4959]: E1007 13:57:45.597591 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 07 13:57:45 crc kubenswrapper[4959]: E1007 13:57:45.598324 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:ni
l,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mnqkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-full_openstack(b14b7636-6093-478a-945a-a512ef1935b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 13:57:45 crc kubenswrapper[4959]: E1007 13:57:45.599492 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="b14b7636-6093-478a-945a-a512ef1935b4" Oct 07 13:57:45 crc kubenswrapper[4959]: I1007 13:57:45.687679 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerStarted","Data":"9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4"} Oct 07 13:57:45 crc kubenswrapper[4959]: E1007 13:57:45.690302 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="b14b7636-6093-478a-945a-a512ef1935b4" Oct 07 13:57:45 crc kubenswrapper[4959]: I1007 13:57:45.728175 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x4ngv" podStartSLOduration=25.686609214 podStartE2EDuration="34.728153598s" podCreationTimestamp="2025-10-07 13:57:11 +0000 UTC" firstStartedPulling="2025-10-07 13:57:13.356927393 +0000 UTC m=+3385.517650070" lastFinishedPulling="2025-10-07 13:57:22.398471777 +0000 UTC m=+3394.559194454" observedRunningTime="2025-10-07 13:57:45.720733168 +0000 UTC m=+3417.881455855" watchObservedRunningTime="2025-10-07 13:57:45.728153598 +0000 UTC m=+3417.888876275" Oct 07 13:57:52 crc kubenswrapper[4959]: I1007 13:57:52.189787 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:52 crc kubenswrapper[4959]: I1007 13:57:52.191580 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:57:53 crc kubenswrapper[4959]: I1007 13:57:53.245730 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-x4ngv" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="registry-server" probeResult="failure" output=< Oct 07 13:57:53 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 13:57:53 crc kubenswrapper[4959]: > Oct 07 13:58:00 crc kubenswrapper[4959]: I1007 13:58:00.287079 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 13:58:01 crc kubenswrapper[4959]: I1007 13:58:01.821819 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"b14b7636-6093-478a-945a-a512ef1935b4","Type":"ContainerStarted","Data":"23431d0f8d90db1970972714c99df72717598486185f6c6b3e977e8e2f948b9c"} Oct 07 13:58:01 crc kubenswrapper[4959]: I1007 13:58:01.847412 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-full" podStartSLOduration=4.292743694 podStartE2EDuration="51.847392092s" podCreationTimestamp="2025-10-07 13:57:10 +0000 UTC" firstStartedPulling="2025-10-07 13:57:12.730076535 +0000 UTC m=+3384.890799212" lastFinishedPulling="2025-10-07 13:58:00.284724933 +0000 UTC m=+3432.445447610" observedRunningTime="2025-10-07 13:58:01.836938176 +0000 UTC m=+3433.997660893" watchObservedRunningTime="2025-10-07 13:58:01.847392092 +0000 UTC m=+3434.008114769" Oct 07 13:58:02 crc kubenswrapper[4959]: I1007 13:58:02.226079 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:58:02 crc kubenswrapper[4959]: I1007 13:58:02.277937 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:58:02 
crc kubenswrapper[4959]: I1007 13:58:02.459885 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4ngv"] Oct 07 13:58:03 crc kubenswrapper[4959]: I1007 13:58:03.839469 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x4ngv" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="registry-server" containerID="cri-o://9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4" gracePeriod=2 Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.319830 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.401654 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-utilities\") pod \"728a7535-30be-4373-8be2-e4b0c4d6f713\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.401806 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-catalog-content\") pod \"728a7535-30be-4373-8be2-e4b0c4d6f713\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.401929 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84nnc\" (UniqueName: \"kubernetes.io/projected/728a7535-30be-4373-8be2-e4b0c4d6f713-kube-api-access-84nnc\") pod \"728a7535-30be-4373-8be2-e4b0c4d6f713\" (UID: \"728a7535-30be-4373-8be2-e4b0c4d6f713\") " Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.404910 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-utilities" (OuterVolumeSpecName: "utilities") pod "728a7535-30be-4373-8be2-e4b0c4d6f713" (UID: "728a7535-30be-4373-8be2-e4b0c4d6f713"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.416077 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728a7535-30be-4373-8be2-e4b0c4d6f713-kube-api-access-84nnc" (OuterVolumeSpecName: "kube-api-access-84nnc") pod "728a7535-30be-4373-8be2-e4b0c4d6f713" (UID: "728a7535-30be-4373-8be2-e4b0c4d6f713"). InnerVolumeSpecName "kube-api-access-84nnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.505152 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.505195 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84nnc\" (UniqueName: \"kubernetes.io/projected/728a7535-30be-4373-8be2-e4b0c4d6f713-kube-api-access-84nnc\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.510746 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "728a7535-30be-4373-8be2-e4b0c4d6f713" (UID: "728a7535-30be-4373-8be2-e4b0c4d6f713"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.608211 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/728a7535-30be-4373-8be2-e4b0c4d6f713-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.857795 4959 generic.go:334] "Generic (PLEG): container finished" podID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerID="9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4" exitCode=0 Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.857909 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerDied","Data":"9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4"} Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.858147 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ngv" event={"ID":"728a7535-30be-4373-8be2-e4b0c4d6f713","Type":"ContainerDied","Data":"5eab10b7b98bd2c3b65bf7a35f024f885bd65858d48f27f15cefd6b1371b4fc7"} Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.858171 4959 scope.go:117] "RemoveContainer" containerID="9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.857925 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4ngv" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.884999 4959 scope.go:117] "RemoveContainer" containerID="63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.891528 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4ngv"] Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.904708 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x4ngv"] Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.931391 4959 scope.go:117] "RemoveContainer" containerID="a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.986134 4959 scope.go:117] "RemoveContainer" containerID="9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4" Oct 07 13:58:04 crc kubenswrapper[4959]: E1007 13:58:04.986573 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4\": container with ID starting with 9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4 not found: ID does not exist" containerID="9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.986609 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4"} err="failed to get container status \"9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4\": rpc error: code = NotFound desc = could not find container \"9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4\": container with ID starting with 9d7731f8cac712e503e5784815ceecffcc72b69ab829488fba1d81d1db5e6cd4 not 
found: ID does not exist" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.986702 4959 scope.go:117] "RemoveContainer" containerID="63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648" Oct 07 13:58:04 crc kubenswrapper[4959]: E1007 13:58:04.987145 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648\": container with ID starting with 63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648 not found: ID does not exist" containerID="63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.987167 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648"} err="failed to get container status \"63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648\": rpc error: code = NotFound desc = could not find container \"63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648\": container with ID starting with 63fe7799dd35a087e7e201f8694b5f23a5e2a234c4269450461ecda6463fc648 not found: ID does not exist" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.987182 4959 scope.go:117] "RemoveContainer" containerID="a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827" Oct 07 13:58:04 crc kubenswrapper[4959]: E1007 13:58:04.987385 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827\": container with ID starting with a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827 not found: ID does not exist" containerID="a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827" Oct 07 13:58:04 crc kubenswrapper[4959]: I1007 13:58:04.987410 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827"} err="failed to get container status \"a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827\": rpc error: code = NotFound desc = could not find container \"a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827\": container with ID starting with a518aa772422359906051aa83450b430aeee08aec695dc2fd45487cd1a0b5827 not found: ID does not exist" Oct 07 13:58:06 crc kubenswrapper[4959]: I1007 13:58:06.825276 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" path="/var/lib/kubelet/pods/728a7535-30be-4373-8be2-e4b0c4d6f713/volumes" Oct 07 13:58:54 crc kubenswrapper[4959]: I1007 13:58:54.771520 4959 scope.go:117] "RemoveContainer" containerID="47664d04910bb272f0b27da6fbdc57ccfa01e14ad8e34ebd4def6fd578b0296d" Oct 07 13:58:54 crc kubenswrapper[4959]: I1007 13:58:54.793752 4959 scope.go:117] "RemoveContainer" containerID="100c6c6889c77bb15e5f7b670047cd2fdb486f7fc6461b68be9d9fe82df231b7" Oct 07 13:58:54 crc kubenswrapper[4959]: I1007 13:58:54.810318 4959 scope.go:117] "RemoveContainer" containerID="c4f2c4a4c849494a10aad3b6d179f050247783dba0b44d6195b224b12395b5cf" Oct 07 13:58:54 crc kubenswrapper[4959]: I1007 13:58:54.834575 4959 scope.go:117] "RemoveContainer" containerID="a43b64e9d04069ee20162041f28777d07a3f18ae8b3af3832238d178448a8d4f" Oct 07 13:59:07 crc kubenswrapper[4959]: I1007 13:59:07.695775 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:59:07 crc kubenswrapper[4959]: I1007 13:59:07.696380 4959 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:59:37 crc kubenswrapper[4959]: I1007 13:59:37.695855 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:59:37 crc kubenswrapper[4959]: I1007 13:59:37.696386 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.167036 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78"] Oct 07 14:00:00 crc kubenswrapper[4959]: E1007 14:00:00.167860 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="extract-utilities" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.167872 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="extract-utilities" Oct 07 14:00:00 crc kubenswrapper[4959]: E1007 14:00:00.167903 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="extract-content" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.167910 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" 
containerName="extract-content" Oct 07 14:00:00 crc kubenswrapper[4959]: E1007 14:00:00.167923 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="registry-server" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.167930 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="registry-server" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.168110 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a7535-30be-4373-8be2-e4b0c4d6f713" containerName="registry-server" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.168725 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.171443 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.181487 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78"] Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.190375 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.208316 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2g6j\" (UniqueName: \"kubernetes.io/projected/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-kube-api-access-v2g6j\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.208404 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-secret-volume\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.208737 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-config-volume\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.310950 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2g6j\" (UniqueName: \"kubernetes.io/projected/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-kube-api-access-v2g6j\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.311025 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-secret-volume\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.311122 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-config-volume\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.311964 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-config-volume\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.323769 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-secret-volume\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.326967 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2g6j\" (UniqueName: \"kubernetes.io/projected/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-kube-api-access-v2g6j\") pod \"collect-profiles-29330760-tdd78\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.495133 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:00 crc kubenswrapper[4959]: I1007 14:00:00.971998 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78"] Oct 07 14:00:01 crc kubenswrapper[4959]: I1007 14:00:01.860070 4959 generic.go:334] "Generic (PLEG): container finished" podID="6eb9815f-4fbe-4bc0-8f97-c7c400851e18" containerID="851070a6390417fe3f4c60db43dae2c6a48d72a57960c337763e3ed148d04855" exitCode=0 Oct 07 14:00:01 crc kubenswrapper[4959]: I1007 14:00:01.860108 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" event={"ID":"6eb9815f-4fbe-4bc0-8f97-c7c400851e18","Type":"ContainerDied","Data":"851070a6390417fe3f4c60db43dae2c6a48d72a57960c337763e3ed148d04855"} Oct 07 14:00:01 crc kubenswrapper[4959]: I1007 14:00:01.860354 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" event={"ID":"6eb9815f-4fbe-4bc0-8f97-c7c400851e18","Type":"ContainerStarted","Data":"b37cde2d9f070206f2de319f050cb52dc5c964cbcb7dbeb51003bdacaeffd441"} Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.210354 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.261346 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-secret-volume\") pod \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.261443 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-config-volume\") pod \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.261513 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2g6j\" (UniqueName: \"kubernetes.io/projected/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-kube-api-access-v2g6j\") pod \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\" (UID: \"6eb9815f-4fbe-4bc0-8f97-c7c400851e18\") " Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.262832 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-config-volume" (OuterVolumeSpecName: "config-volume") pod "6eb9815f-4fbe-4bc0-8f97-c7c400851e18" (UID: "6eb9815f-4fbe-4bc0-8f97-c7c400851e18"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.267337 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6eb9815f-4fbe-4bc0-8f97-c7c400851e18" (UID: "6eb9815f-4fbe-4bc0-8f97-c7c400851e18"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.273006 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-kube-api-access-v2g6j" (OuterVolumeSpecName: "kube-api-access-v2g6j") pod "6eb9815f-4fbe-4bc0-8f97-c7c400851e18" (UID: "6eb9815f-4fbe-4bc0-8f97-c7c400851e18"). InnerVolumeSpecName "kube-api-access-v2g6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.363215 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.363262 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.363275 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2g6j\" (UniqueName: \"kubernetes.io/projected/6eb9815f-4fbe-4bc0-8f97-c7c400851e18-kube-api-access-v2g6j\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.880036 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" event={"ID":"6eb9815f-4fbe-4bc0-8f97-c7c400851e18","Type":"ContainerDied","Data":"b37cde2d9f070206f2de319f050cb52dc5c964cbcb7dbeb51003bdacaeffd441"} Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.880069 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78" Oct 07 14:00:03 crc kubenswrapper[4959]: I1007 14:00:03.880085 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b37cde2d9f070206f2de319f050cb52dc5c964cbcb7dbeb51003bdacaeffd441" Oct 07 14:00:04 crc kubenswrapper[4959]: I1007 14:00:04.304112 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"] Oct 07 14:00:04 crc kubenswrapper[4959]: I1007 14:00:04.314247 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-8vtlk"] Oct 07 14:00:04 crc kubenswrapper[4959]: I1007 14:00:04.823545 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5d4db1-1025-48c0-850d-54ac32c93f1f" path="/var/lib/kubelet/pods/aa5d4db1-1025-48c0-850d-54ac32c93f1f/volumes" Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.696191 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.696505 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.696552 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.697756 4959 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ff27817fd67efa886d22b5481e5b4f78c6042f02cfd058c0ade04e6527315c9"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.697861 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://2ff27817fd67efa886d22b5481e5b4f78c6042f02cfd058c0ade04e6527315c9" gracePeriod=600 Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.917441 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="2ff27817fd67efa886d22b5481e5b4f78c6042f02cfd058c0ade04e6527315c9" exitCode=0 Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.917517 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"2ff27817fd67efa886d22b5481e5b4f78c6042f02cfd058c0ade04e6527315c9"} Oct 07 14:00:07 crc kubenswrapper[4959]: I1007 14:00:07.917763 4959 scope.go:117] "RemoveContainer" containerID="7a26c139d4f4511c16c2c2ba826d784a3ac426fa22ba34c400b387f8d13adefe" Oct 07 14:00:08 crc kubenswrapper[4959]: I1007 14:00:08.928254 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"} Oct 07 14:00:54 crc kubenswrapper[4959]: I1007 14:00:54.912876 4959 scope.go:117] "RemoveContainer" 
containerID="04ba059da8bd75f5ff275d86fe67b2a955bbee0c57b2a3a920151b8480efc867" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.149247 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29330761-j7cjf"] Oct 07 14:01:00 crc kubenswrapper[4959]: E1007 14:01:00.150251 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb9815f-4fbe-4bc0-8f97-c7c400851e18" containerName="collect-profiles" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.150269 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb9815f-4fbe-4bc0-8f97-c7c400851e18" containerName="collect-profiles" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.150550 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb9815f-4fbe-4bc0-8f97-c7c400851e18" containerName="collect-profiles" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.153174 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.157897 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330761-j7cjf"] Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.269207 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-combined-ca-bundle\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.269288 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-config-data\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " 
pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.269378 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfr5\" (UniqueName: \"kubernetes.io/projected/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-kube-api-access-dlfr5\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.269453 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-fernet-keys\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.371145 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfr5\" (UniqueName: \"kubernetes.io/projected/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-kube-api-access-dlfr5\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.371294 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-fernet-keys\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.371366 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-combined-ca-bundle\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " 
pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.371434 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-config-data\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.378009 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-combined-ca-bundle\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.379090 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-fernet-keys\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.379379 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-config-data\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.388972 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfr5\" (UniqueName: \"kubernetes.io/projected/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-kube-api-access-dlfr5\") pod \"keystone-cron-29330761-j7cjf\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 
14:01:00.490477 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:00 crc kubenswrapper[4959]: I1007 14:01:00.934423 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330761-j7cjf"] Oct 07 14:01:01 crc kubenswrapper[4959]: I1007 14:01:01.437704 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-j7cjf" event={"ID":"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40","Type":"ContainerStarted","Data":"e988949241647bfcc8452a16fd51702a76bdd1dfbf96aec4172ae4db4370730c"} Oct 07 14:01:01 crc kubenswrapper[4959]: I1007 14:01:01.438099 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-j7cjf" event={"ID":"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40","Type":"ContainerStarted","Data":"0fbf20891183f5e006d4a18b08cb7e43b3c4296fe1df7cab6f4cb8efd0be819a"} Oct 07 14:01:01 crc kubenswrapper[4959]: I1007 14:01:01.466363 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29330761-j7cjf" podStartSLOduration=1.466340556 podStartE2EDuration="1.466340556s" podCreationTimestamp="2025-10-07 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:01:01.460231333 +0000 UTC m=+3613.620954030" watchObservedRunningTime="2025-10-07 14:01:01.466340556 +0000 UTC m=+3613.627063243" Oct 07 14:01:03 crc kubenswrapper[4959]: I1007 14:01:03.455837 4959 generic.go:334] "Generic (PLEG): container finished" podID="0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" containerID="e988949241647bfcc8452a16fd51702a76bdd1dfbf96aec4172ae4db4370730c" exitCode=0 Oct 07 14:01:03 crc kubenswrapper[4959]: I1007 14:01:03.456203 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-j7cjf" 
event={"ID":"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40","Type":"ContainerDied","Data":"e988949241647bfcc8452a16fd51702a76bdd1dfbf96aec4172ae4db4370730c"} Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.883147 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.962489 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlfr5\" (UniqueName: \"kubernetes.io/projected/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-kube-api-access-dlfr5\") pod \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.962540 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-fernet-keys\") pod \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.962606 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-config-data\") pod \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.962793 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-combined-ca-bundle\") pod \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\" (UID: \"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40\") " Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.986444 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" (UID: "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:04 crc kubenswrapper[4959]: I1007 14:01:04.992073 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-kube-api-access-dlfr5" (OuterVolumeSpecName: "kube-api-access-dlfr5") pod "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" (UID: "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40"). InnerVolumeSpecName "kube-api-access-dlfr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.012793 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" (UID: "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.047821 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-config-data" (OuterVolumeSpecName: "config-data") pod "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" (UID: "0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.064948 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlfr5\" (UniqueName: \"kubernetes.io/projected/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-kube-api-access-dlfr5\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.064979 4959 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.064989 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.064996 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.471944 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-j7cjf" event={"ID":"0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40","Type":"ContainerDied","Data":"0fbf20891183f5e006d4a18b08cb7e43b3c4296fe1df7cab6f4cb8efd0be819a"} Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.471984 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fbf20891183f5e006d4a18b08cb7e43b3c4296fe1df7cab6f4cb8efd0be819a" Oct 07 14:01:05 crc kubenswrapper[4959]: I1007 14:01:05.472143 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330761-j7cjf" Oct 07 14:01:30 crc kubenswrapper[4959]: I1007 14:01:30.046998 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-svqh8"] Oct 07 14:01:30 crc kubenswrapper[4959]: I1007 14:01:30.056460 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-svqh8"] Oct 07 14:01:30 crc kubenswrapper[4959]: I1007 14:01:30.822366 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d44e226-eb49-498b-a3c3-4fef79b4123e" path="/var/lib/kubelet/pods/9d44e226-eb49-498b-a3c3-4fef79b4123e/volumes" Oct 07 14:01:40 crc kubenswrapper[4959]: I1007 14:01:40.040169 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-3bfb-account-create-kcln7"] Oct 07 14:01:40 crc kubenswrapper[4959]: I1007 14:01:40.048836 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-3bfb-account-create-kcln7"] Oct 07 14:01:40 crc kubenswrapper[4959]: I1007 14:01:40.818848 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb1aef0-7687-47f6-a79b-515a5d4d6791" path="/var/lib/kubelet/pods/7fb1aef0-7687-47f6-a79b-515a5d4d6791/volumes" Oct 07 14:01:54 crc kubenswrapper[4959]: I1007 14:01:54.973013 4959 scope.go:117] "RemoveContainer" containerID="d0492eb4bbfff77a4f362e04d75722eef0005fc9e354f764a9f28800cd4e9d89" Oct 07 14:01:54 crc kubenswrapper[4959]: I1007 14:01:54.995266 4959 scope.go:117] "RemoveContainer" containerID="bd4fae6001249ef6a531d909cef8b8ba89b596272f0158d0632c2ba1694c62da" Oct 07 14:02:03 crc kubenswrapper[4959]: I1007 14:02:03.039926 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-wwhr6"] Oct 07 14:02:03 crc kubenswrapper[4959]: I1007 14:02:03.053976 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-wwhr6"] Oct 07 14:02:04 crc kubenswrapper[4959]: I1007 14:02:04.819694 4959 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f40be374-6671-451c-a271-163847256266" path="/var/lib/kubelet/pods/f40be374-6671-451c-a271-163847256266/volumes" Oct 07 14:02:07 crc kubenswrapper[4959]: I1007 14:02:07.695174 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:02:07 crc kubenswrapper[4959]: I1007 14:02:07.695505 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:02:37 crc kubenswrapper[4959]: I1007 14:02:37.695390 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:02:37 crc kubenswrapper[4959]: I1007 14:02:37.695906 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:02:55 crc kubenswrapper[4959]: I1007 14:02:55.092217 4959 scope.go:117] "RemoveContainer" containerID="f15e79bc5c4b138c559fd8beeab050a72a723fec38087cc426049a56d8d9e04f" Oct 07 14:03:07 crc kubenswrapper[4959]: I1007 14:03:07.697099 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:03:07 crc kubenswrapper[4959]: I1007 14:03:07.697692 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:03:07 crc kubenswrapper[4959]: I1007 14:03:07.697733 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 14:03:07 crc kubenswrapper[4959]: I1007 14:03:07.698429 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 14:03:07 crc kubenswrapper[4959]: I1007 14:03:07.698474 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" gracePeriod=600
Oct 07 14:03:07 crc kubenswrapper[4959]: E1007 14:03:07.823432 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:03:08 crc kubenswrapper[4959]: I1007 14:03:08.518643 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" exitCode=0
Oct 07 14:03:08 crc kubenswrapper[4959]: I1007 14:03:08.518649 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"}
Oct 07 14:03:08 crc kubenswrapper[4959]: I1007 14:03:08.519000 4959 scope.go:117] "RemoveContainer" containerID="2ff27817fd67efa886d22b5481e5b4f78c6042f02cfd058c0ade04e6527315c9"
Oct 07 14:03:08 crc kubenswrapper[4959]: I1007 14:03:08.519681 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:03:08 crc kubenswrapper[4959]: E1007 14:03:08.519998 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.096009 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9tsjm"]
Oct 07 14:03:11 crc kubenswrapper[4959]: E1007 14:03:11.097120 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" containerName="keystone-cron"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.097137 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" containerName="keystone-cron"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.097371 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40" containerName="keystone-cron"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.099150 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.106245 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tsjm"]
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.236499 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9pp\" (UniqueName: \"kubernetes.io/projected/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-kube-api-access-5l9pp\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.236578 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-catalog-content\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.236620 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-utilities\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.338277 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9pp\" (UniqueName: \"kubernetes.io/projected/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-kube-api-access-5l9pp\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.338433 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-catalog-content\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.338518 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-utilities\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.339251 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-utilities\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.339316 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-catalog-content\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.371318 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9pp\" (UniqueName: \"kubernetes.io/projected/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-kube-api-access-5l9pp\") pod \"community-operators-9tsjm\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") " pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.439941 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:11 crc kubenswrapper[4959]: I1007 14:03:11.997145 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tsjm"]
Oct 07 14:03:12 crc kubenswrapper[4959]: I1007 14:03:12.591121 4959 generic.go:334] "Generic (PLEG): container finished" podID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerID="ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac" exitCode=0
Oct 07 14:03:12 crc kubenswrapper[4959]: I1007 14:03:12.591431 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerDied","Data":"ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac"}
Oct 07 14:03:12 crc kubenswrapper[4959]: I1007 14:03:12.591465 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerStarted","Data":"b544216bb1b2a39c13978691db1285707a30e2408d5692865345a4898e819b1b"}
Oct 07 14:03:12 crc kubenswrapper[4959]: I1007 14:03:12.593343 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 14:03:13 crc kubenswrapper[4959]: I1007 14:03:13.611046 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerStarted","Data":"8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca"}
Oct 07 14:03:14 crc kubenswrapper[4959]: I1007 14:03:14.623066 4959 generic.go:334] "Generic (PLEG): container finished" podID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerID="8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca" exitCode=0
Oct 07 14:03:14 crc kubenswrapper[4959]: I1007 14:03:14.623145 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerDied","Data":"8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca"}
Oct 07 14:03:15 crc kubenswrapper[4959]: I1007 14:03:15.635045 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerStarted","Data":"a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e"}
Oct 07 14:03:15 crc kubenswrapper[4959]: I1007 14:03:15.654892 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9tsjm" podStartSLOduration=2.143474082 podStartE2EDuration="4.654870789s" podCreationTimestamp="2025-10-07 14:03:11 +0000 UTC" firstStartedPulling="2025-10-07 14:03:12.593110649 +0000 UTC m=+3744.753833326" lastFinishedPulling="2025-10-07 14:03:15.104507356 +0000 UTC m=+3747.265230033" observedRunningTime="2025-10-07 14:03:15.653242833 +0000 UTC m=+3747.813965510" watchObservedRunningTime="2025-10-07 14:03:15.654870789 +0000 UTC m=+3747.815593476"
Oct 07 14:03:19 crc kubenswrapper[4959]: I1007 14:03:19.809559 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:03:19 crc kubenswrapper[4959]: E1007 14:03:19.810219 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:03:21 crc kubenswrapper[4959]: I1007 14:03:21.440349 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:21 crc kubenswrapper[4959]: I1007 14:03:21.442164 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:21 crc kubenswrapper[4959]: I1007 14:03:21.509561 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:21 crc kubenswrapper[4959]: I1007 14:03:21.746907 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:21 crc kubenswrapper[4959]: I1007 14:03:21.793778 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tsjm"]
Oct 07 14:03:23 crc kubenswrapper[4959]: I1007 14:03:23.718158 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9tsjm" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="registry-server" containerID="cri-o://a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e" gracePeriod=2
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.554644 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.702343 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-utilities\") pod \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") "
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.702419 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-catalog-content\") pod \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") "
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.702712 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l9pp\" (UniqueName: \"kubernetes.io/projected/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-kube-api-access-5l9pp\") pod \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\" (UID: \"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7\") "
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.703871 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-utilities" (OuterVolumeSpecName: "utilities") pod "8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" (UID: "8a7eb50f-3655-47f6-84a2-cbd0e4d49af7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.725605 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-kube-api-access-5l9pp" (OuterVolumeSpecName: "kube-api-access-5l9pp") pod "8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" (UID: "8a7eb50f-3655-47f6-84a2-cbd0e4d49af7"). InnerVolumeSpecName "kube-api-access-5l9pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.735172 4959 generic.go:334] "Generic (PLEG): container finished" podID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerID="a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e" exitCode=0
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.735226 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerDied","Data":"a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e"}
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.735255 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tsjm" event={"ID":"8a7eb50f-3655-47f6-84a2-cbd0e4d49af7","Type":"ContainerDied","Data":"b544216bb1b2a39c13978691db1285707a30e2408d5692865345a4898e819b1b"}
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.735273 4959 scope.go:117] "RemoveContainer" containerID="a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.735406 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tsjm"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.786361 4959 scope.go:117] "RemoveContainer" containerID="8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.800968 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" (UID: "8a7eb50f-3655-47f6-84a2-cbd0e4d49af7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.805212 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9pp\" (UniqueName: \"kubernetes.io/projected/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-kube-api-access-5l9pp\") on node \"crc\" DevicePath \"\""
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.805250 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.805264 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.819504 4959 scope.go:117] "RemoveContainer" containerID="ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.869435 4959 scope.go:117] "RemoveContainer" containerID="a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e"
Oct 07 14:03:24 crc kubenswrapper[4959]: E1007 14:03:24.872945 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e\": container with ID starting with a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e not found: ID does not exist" containerID="a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.873010 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e"} err="failed to get container status \"a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e\": rpc error: code = NotFound desc = could not find container \"a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e\": container with ID starting with a430d81ff0c3969a7acddc9fcfb2a25fe3bc91c05d1548f78163695109c5433e not found: ID does not exist"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.873047 4959 scope.go:117] "RemoveContainer" containerID="8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca"
Oct 07 14:03:24 crc kubenswrapper[4959]: E1007 14:03:24.873385 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca\": container with ID starting with 8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca not found: ID does not exist" containerID="8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.873434 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca"} err="failed to get container status \"8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca\": rpc error: code = NotFound desc = could not find container \"8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca\": container with ID starting with 8abb1e3153a74a696baab8c0e890dbd45826a59710630406948f9b7cb08276ca not found: ID does not exist"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.873470 4959 scope.go:117] "RemoveContainer" containerID="ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac"
Oct 07 14:03:24 crc kubenswrapper[4959]: E1007 14:03:24.878773 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac\": container with ID starting with ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac not found: ID does not exist" containerID="ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac"
Oct 07 14:03:24 crc kubenswrapper[4959]: I1007 14:03:24.878821 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac"} err="failed to get container status \"ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac\": rpc error: code = NotFound desc = could not find container \"ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac\": container with ID starting with ecbea1502b22674e4fe2e5db1e2fce1f07df6f107ec7fb89d6dbb9846179b8ac not found: ID does not exist"
Oct 07 14:03:25 crc kubenswrapper[4959]: I1007 14:03:25.061184 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tsjm"]
Oct 07 14:03:25 crc kubenswrapper[4959]: I1007 14:03:25.072369 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9tsjm"]
Oct 07 14:03:26 crc kubenswrapper[4959]: I1007 14:03:26.822988 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" path="/var/lib/kubelet/pods/8a7eb50f-3655-47f6-84a2-cbd0e4d49af7/volumes"
Oct 07 14:03:34 crc kubenswrapper[4959]: I1007 14:03:34.809312 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:03:34 crc kubenswrapper[4959]: E1007 14:03:34.810143 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:03:49 crc kubenswrapper[4959]: I1007 14:03:49.809119 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:03:49 crc kubenswrapper[4959]: E1007 14:03:49.809777 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:04:00 crc kubenswrapper[4959]: I1007 14:04:00.808304 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:04:00 crc kubenswrapper[4959]: E1007 14:04:00.809191 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:04:11 crc kubenswrapper[4959]: I1007 14:04:11.809102 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:04:11 crc kubenswrapper[4959]: E1007 14:04:11.810011 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:04:24 crc kubenswrapper[4959]: I1007 14:04:24.809061 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:04:24 crc kubenswrapper[4959]: E1007 14:04:24.809819 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:04:37 crc kubenswrapper[4959]: I1007 14:04:37.810840 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:04:37 crc kubenswrapper[4959]: E1007 14:04:37.812802 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:04:48 crc kubenswrapper[4959]: I1007 14:04:48.815859 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:04:48 crc kubenswrapper[4959]: E1007 14:04:48.816499 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:04:59 crc kubenswrapper[4959]: I1007 14:04:59.817552 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:04:59 crc kubenswrapper[4959]: E1007 14:04:59.842187 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:05:12 crc kubenswrapper[4959]: I1007 14:05:12.809639 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:05:12 crc kubenswrapper[4959]: E1007 14:05:12.811930 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:05:24 crc kubenswrapper[4959]: I1007 14:05:24.808521 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:05:24 crc kubenswrapper[4959]: E1007 14:05:24.809090 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:05:37 crc kubenswrapper[4959]: I1007 14:05:37.809000 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:05:37 crc kubenswrapper[4959]: E1007 14:05:37.809814 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:05:49 crc kubenswrapper[4959]: I1007 14:05:49.809438 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:05:49 crc kubenswrapper[4959]: E1007 14:05:49.810135 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:06:04 crc kubenswrapper[4959]: I1007 14:06:04.808767 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:06:04 crc kubenswrapper[4959]: E1007 14:06:04.809584 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:06:19 crc kubenswrapper[4959]: I1007 14:06:19.810263 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:06:19 crc kubenswrapper[4959]: E1007 14:06:19.811037 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:06:34 crc kubenswrapper[4959]: I1007 14:06:34.809252 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:06:34 crc kubenswrapper[4959]: E1007 14:06:34.809982 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:06:47 crc kubenswrapper[4959]: I1007 14:06:47.808760 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:06:47 crc kubenswrapper[4959]: E1007 14:06:47.809895 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:07:02 crc kubenswrapper[4959]: I1007 14:07:02.809400 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:07:02 crc kubenswrapper[4959]: E1007 14:07:02.810219 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:07:13 crc kubenswrapper[4959]: I1007 14:07:13.809115 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:07:13 crc kubenswrapper[4959]: E1007 14:07:13.809995 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:07:24 crc kubenswrapper[4959]: I1007 14:07:24.809979 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37"
Oct 07 14:07:24 crc kubenswrapper[4959]: E1007 14:07:24.810684 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.888793 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7n5xl"]
Oct 07 14:07:27 crc kubenswrapper[4959]: E1007 14:07:27.890050 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="extract-content"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.890071 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="extract-content"
Oct 07 14:07:27 crc kubenswrapper[4959]: E1007 14:07:27.890090 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="registry-server"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.890098 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="registry-server"
Oct 07 14:07:27 crc kubenswrapper[4959]: E1007 14:07:27.890111 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="extract-utilities"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.890120 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="extract-utilities"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.890388 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a7eb50f-3655-47f6-84a2-cbd0e4d49af7" containerName="registry-server"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.892419 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.913740 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7n5xl"]
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.928279 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-catalog-content\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.928363 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-utilities\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:27 crc kubenswrapper[4959]: I1007 14:07:27.928382 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjw2l\" (UniqueName: \"kubernetes.io/projected/b0496464-69b6-47df-8658-25e65c392668-kube-api-access-pjw2l\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.030022 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-catalog-content\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.030147 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-utilities\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.030182 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjw2l\" (UniqueName: \"kubernetes.io/projected/b0496464-69b6-47df-8658-25e65c392668-kube-api-access-pjw2l\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.030790 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-utilities\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.030793 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-catalog-content\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.053623 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjw2l\" (UniqueName: \"kubernetes.io/projected/b0496464-69b6-47df-8658-25e65c392668-kube-api-access-pjw2l\") pod \"certified-operators-7n5xl\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " pod="openshift-marketplace/certified-operators-7n5xl"
Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.215760 4959 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:28 crc kubenswrapper[4959]: I1007 14:07:28.913294 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7n5xl"] Oct 07 14:07:29 crc kubenswrapper[4959]: I1007 14:07:29.791799 4959 generic.go:334] "Generic (PLEG): container finished" podID="b0496464-69b6-47df-8658-25e65c392668" containerID="6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea" exitCode=0 Oct 07 14:07:29 crc kubenswrapper[4959]: I1007 14:07:29.791842 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7n5xl" event={"ID":"b0496464-69b6-47df-8658-25e65c392668","Type":"ContainerDied","Data":"6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea"} Oct 07 14:07:29 crc kubenswrapper[4959]: I1007 14:07:29.792086 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7n5xl" event={"ID":"b0496464-69b6-47df-8658-25e65c392668","Type":"ContainerStarted","Data":"164854a229eb81a6322dc1d4451c9931f06f937b5461df0c4147c79fffe666fd"} Oct 07 14:07:31 crc kubenswrapper[4959]: I1007 14:07:31.821382 4959 generic.go:334] "Generic (PLEG): container finished" podID="b0496464-69b6-47df-8658-25e65c392668" containerID="7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736" exitCode=0 Oct 07 14:07:31 crc kubenswrapper[4959]: I1007 14:07:31.822023 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7n5xl" event={"ID":"b0496464-69b6-47df-8658-25e65c392668","Type":"ContainerDied","Data":"7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736"} Oct 07 14:07:33 crc kubenswrapper[4959]: I1007 14:07:33.840764 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7n5xl" 
event={"ID":"b0496464-69b6-47df-8658-25e65c392668","Type":"ContainerStarted","Data":"797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f"} Oct 07 14:07:33 crc kubenswrapper[4959]: I1007 14:07:33.863910 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7n5xl" podStartSLOduration=4.005617663 podStartE2EDuration="6.863886366s" podCreationTimestamp="2025-10-07 14:07:27 +0000 UTC" firstStartedPulling="2025-10-07 14:07:29.793342472 +0000 UTC m=+4001.954065149" lastFinishedPulling="2025-10-07 14:07:32.651611175 +0000 UTC m=+4004.812333852" observedRunningTime="2025-10-07 14:07:33.860224921 +0000 UTC m=+4006.020947618" watchObservedRunningTime="2025-10-07 14:07:33.863886366 +0000 UTC m=+4006.024609043" Oct 07 14:07:35 crc kubenswrapper[4959]: I1007 14:07:35.809441 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" Oct 07 14:07:35 crc kubenswrapper[4959]: E1007 14:07:35.809974 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:07:38 crc kubenswrapper[4959]: I1007 14:07:38.216338 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:38 crc kubenswrapper[4959]: I1007 14:07:38.216976 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:38 crc kubenswrapper[4959]: I1007 14:07:38.265911 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:38 crc kubenswrapper[4959]: I1007 14:07:38.934581 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:38 crc kubenswrapper[4959]: I1007 14:07:38.984798 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7n5xl"] Oct 07 14:07:40 crc kubenswrapper[4959]: I1007 14:07:40.896426 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7n5xl" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="registry-server" containerID="cri-o://797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f" gracePeriod=2 Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.575869 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.625834 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-catalog-content\") pod \"b0496464-69b6-47df-8658-25e65c392668\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.626058 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjw2l\" (UniqueName: \"kubernetes.io/projected/b0496464-69b6-47df-8658-25e65c392668-kube-api-access-pjw2l\") pod \"b0496464-69b6-47df-8658-25e65c392668\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.626094 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-utilities\") pod 
\"b0496464-69b6-47df-8658-25e65c392668\" (UID: \"b0496464-69b6-47df-8658-25e65c392668\") " Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.627262 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-utilities" (OuterVolumeSpecName: "utilities") pod "b0496464-69b6-47df-8658-25e65c392668" (UID: "b0496464-69b6-47df-8658-25e65c392668"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.638316 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0496464-69b6-47df-8658-25e65c392668-kube-api-access-pjw2l" (OuterVolumeSpecName: "kube-api-access-pjw2l") pod "b0496464-69b6-47df-8658-25e65c392668" (UID: "b0496464-69b6-47df-8658-25e65c392668"). InnerVolumeSpecName "kube-api-access-pjw2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.684787 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0496464-69b6-47df-8658-25e65c392668" (UID: "b0496464-69b6-47df-8658-25e65c392668"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.728752 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjw2l\" (UniqueName: \"kubernetes.io/projected/b0496464-69b6-47df-8658-25e65c392668-kube-api-access-pjw2l\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.728786 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.728795 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0496464-69b6-47df-8658-25e65c392668-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.907587 4959 generic.go:334] "Generic (PLEG): container finished" podID="b0496464-69b6-47df-8658-25e65c392668" containerID="797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f" exitCode=0 Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.907776 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7n5xl" event={"ID":"b0496464-69b6-47df-8658-25e65c392668","Type":"ContainerDied","Data":"797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f"} Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.908495 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7n5xl" event={"ID":"b0496464-69b6-47df-8658-25e65c392668","Type":"ContainerDied","Data":"164854a229eb81a6322dc1d4451c9931f06f937b5461df0c4147c79fffe666fd"} Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.908613 4959 scope.go:117] "RemoveContainer" containerID="797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 
14:07:41.907923 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7n5xl" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.933109 4959 scope.go:117] "RemoveContainer" containerID="7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.944752 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7n5xl"] Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.952687 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7n5xl"] Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.968574 4959 scope.go:117] "RemoveContainer" containerID="6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.997826 4959 scope.go:117] "RemoveContainer" containerID="797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f" Oct 07 14:07:41 crc kubenswrapper[4959]: E1007 14:07:41.998526 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f\": container with ID starting with 797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f not found: ID does not exist" containerID="797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.998576 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f"} err="failed to get container status \"797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f\": rpc error: code = NotFound desc = could not find container \"797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f\": container with ID starting with 
797524666c65f0a7aa74452a8538849dc9719d621f6182956ab6a11e7a2d9e5f not found: ID does not exist" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.998606 4959 scope.go:117] "RemoveContainer" containerID="7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736" Oct 07 14:07:41 crc kubenswrapper[4959]: E1007 14:07:41.999026 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736\": container with ID starting with 7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736 not found: ID does not exist" containerID="7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.999060 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736"} err="failed to get container status \"7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736\": rpc error: code = NotFound desc = could not find container \"7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736\": container with ID starting with 7155f038218489112dcfc5fa03f15c2703ca9f48e6b8468f00bb6ba3f3dda736 not found: ID does not exist" Oct 07 14:07:41 crc kubenswrapper[4959]: I1007 14:07:41.999092 4959 scope.go:117] "RemoveContainer" containerID="6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea" Oct 07 14:07:41 crc kubenswrapper[4959]: E1007 14:07:41.999435 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea\": container with ID starting with 6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea not found: ID does not exist" containerID="6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea" Oct 07 14:07:41 crc 
kubenswrapper[4959]: I1007 14:07:41.999483 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea"} err="failed to get container status \"6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea\": rpc error: code = NotFound desc = could not find container \"6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea\": container with ID starting with 6a365f1db5eaca852052a331e13183b044dba2f497f740a233396b2fe66800ea not found: ID does not exist" Oct 07 14:07:42 crc kubenswrapper[4959]: I1007 14:07:42.820716 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0496464-69b6-47df-8658-25e65c392668" path="/var/lib/kubelet/pods/b0496464-69b6-47df-8658-25e65c392668/volumes" Oct 07 14:07:47 crc kubenswrapper[4959]: I1007 14:07:47.809538 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" Oct 07 14:07:47 crc kubenswrapper[4959]: E1007 14:07:47.810499 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:07:59 crc kubenswrapper[4959]: I1007 14:07:59.809822 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" Oct 07 14:07:59 crc kubenswrapper[4959]: E1007 14:07:59.810655 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.116933 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fsf66"] Oct 07 14:08:08 crc kubenswrapper[4959]: E1007 14:08:08.117883 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="registry-server" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.117897 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="registry-server" Oct 07 14:08:08 crc kubenswrapper[4959]: E1007 14:08:08.117907 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="extract-utilities" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.117915 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="extract-utilities" Oct 07 14:08:08 crc kubenswrapper[4959]: E1007 14:08:08.117929 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="extract-content" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.117935 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="extract-content" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.118131 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0496464-69b6-47df-8658-25e65c392668" containerName="registry-server" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.119440 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.130515 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsf66"] Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.223802 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-catalog-content\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.223939 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tsqz\" (UniqueName: \"kubernetes.io/projected/11038622-628d-4ae0-a297-352e8ea496e1-kube-api-access-8tsqz\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.224018 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-utilities\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.325778 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-utilities\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.325832 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-catalog-content\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.325941 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tsqz\" (UniqueName: \"kubernetes.io/projected/11038622-628d-4ae0-a297-352e8ea496e1-kube-api-access-8tsqz\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.326292 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-utilities\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.326696 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-catalog-content\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.345177 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tsqz\" (UniqueName: \"kubernetes.io/projected/11038622-628d-4ae0-a297-352e8ea496e1-kube-api-access-8tsqz\") pod \"redhat-operators-fsf66\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.443149 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:08 crc kubenswrapper[4959]: I1007 14:08:08.931493 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsf66"] Oct 07 14:08:09 crc kubenswrapper[4959]: I1007 14:08:09.142963 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerStarted","Data":"78ab201084f5be88cde10546f04306944059b62015a53feaced2fe7f1a1777e4"} Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.153446 4959 generic.go:334] "Generic (PLEG): container finished" podID="11038622-628d-4ae0-a297-352e8ea496e1" containerID="e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792" exitCode=0 Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.153568 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerDied","Data":"e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792"} Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.529029 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wqnxt"] Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.531877 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.538662 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqnxt"] Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.678239 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fq6\" (UniqueName: \"kubernetes.io/projected/0fa26bfd-4429-49f5-a05e-aff4231abcc8-kube-api-access-r8fq6\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.678354 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-utilities\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.678474 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-catalog-content\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.781362 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fq6\" (UniqueName: \"kubernetes.io/projected/0fa26bfd-4429-49f5-a05e-aff4231abcc8-kube-api-access-r8fq6\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.781527 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-utilities\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.781723 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-catalog-content\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.782158 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-catalog-content\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.782182 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-utilities\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.802620 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fq6\" (UniqueName: \"kubernetes.io/projected/0fa26bfd-4429-49f5-a05e-aff4231abcc8-kube-api-access-r8fq6\") pod \"redhat-marketplace-wqnxt\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:10 crc kubenswrapper[4959]: I1007 14:08:10.870715 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:11 crc kubenswrapper[4959]: I1007 14:08:11.407404 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqnxt"] Oct 07 14:08:12 crc kubenswrapper[4959]: I1007 14:08:12.171515 4959 generic.go:334] "Generic (PLEG): container finished" podID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerID="b4e6d84dcb113f59ec4340d979bebc360210f081371902043fcc098d8594a4eb" exitCode=0 Oct 07 14:08:12 crc kubenswrapper[4959]: I1007 14:08:12.171683 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerDied","Data":"b4e6d84dcb113f59ec4340d979bebc360210f081371902043fcc098d8594a4eb"} Oct 07 14:08:12 crc kubenswrapper[4959]: I1007 14:08:12.171796 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerStarted","Data":"4cbb3561d6126b3095f2114783de023f1517d63848949e8aa58bcb44a944211d"} Oct 07 14:08:12 crc kubenswrapper[4959]: I1007 14:08:12.809075 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" Oct 07 14:08:13 crc kubenswrapper[4959]: I1007 14:08:13.181901 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"8db342c1eccd2ddb10f780bbd04720d9a58bca24ac7139df92d7010fec25c49c"} Oct 07 14:08:13 crc kubenswrapper[4959]: I1007 14:08:13.184163 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerStarted","Data":"546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251"} Oct 07 
14:08:14 crc kubenswrapper[4959]: I1007 14:08:14.192358 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerStarted","Data":"f31a735714b4a40753b90761af79cd5325b7f251cb508d26d878213d8bb629c3"} Oct 07 14:08:15 crc kubenswrapper[4959]: I1007 14:08:15.205121 4959 generic.go:334] "Generic (PLEG): container finished" podID="11038622-628d-4ae0-a297-352e8ea496e1" containerID="546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251" exitCode=0 Oct 07 14:08:15 crc kubenswrapper[4959]: I1007 14:08:15.205207 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerDied","Data":"546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251"} Oct 07 14:08:15 crc kubenswrapper[4959]: I1007 14:08:15.209843 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:08:16 crc kubenswrapper[4959]: I1007 14:08:16.223944 4959 generic.go:334] "Generic (PLEG): container finished" podID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerID="f31a735714b4a40753b90761af79cd5325b7f251cb508d26d878213d8bb629c3" exitCode=0 Oct 07 14:08:16 crc kubenswrapper[4959]: I1007 14:08:16.223992 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerDied","Data":"f31a735714b4a40753b90761af79cd5325b7f251cb508d26d878213d8bb629c3"} Oct 07 14:08:17 crc kubenswrapper[4959]: I1007 14:08:17.252460 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerStarted","Data":"0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b"} Oct 07 14:08:17 crc kubenswrapper[4959]: I1007 
14:08:17.283781 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fsf66" podStartSLOduration=3.124730395 podStartE2EDuration="9.283753552s" podCreationTimestamp="2025-10-07 14:08:08 +0000 UTC" firstStartedPulling="2025-10-07 14:08:10.155483554 +0000 UTC m=+4042.316206231" lastFinishedPulling="2025-10-07 14:08:16.314506711 +0000 UTC m=+4048.475229388" observedRunningTime="2025-10-07 14:08:17.28055166 +0000 UTC m=+4049.441274347" watchObservedRunningTime="2025-10-07 14:08:17.283753552 +0000 UTC m=+4049.444476229" Oct 07 14:08:18 crc kubenswrapper[4959]: I1007 14:08:18.264716 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerStarted","Data":"8acb3e78c3a178d3bddc62f3181cc4a7f290339e04a3f4146229acfa562a38f3"} Oct 07 14:08:18 crc kubenswrapper[4959]: I1007 14:08:18.295337 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wqnxt" podStartSLOduration=3.122474457 podStartE2EDuration="8.295317227s" podCreationTimestamp="2025-10-07 14:08:10 +0000 UTC" firstStartedPulling="2025-10-07 14:08:12.173026604 +0000 UTC m=+4044.333749281" lastFinishedPulling="2025-10-07 14:08:17.345869374 +0000 UTC m=+4049.506592051" observedRunningTime="2025-10-07 14:08:18.287258956 +0000 UTC m=+4050.447981653" watchObservedRunningTime="2025-10-07 14:08:18.295317227 +0000 UTC m=+4050.456039904" Oct 07 14:08:18 crc kubenswrapper[4959]: I1007 14:08:18.443899 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:18 crc kubenswrapper[4959]: I1007 14:08:18.444028 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:19 crc kubenswrapper[4959]: I1007 14:08:19.498985 4959 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fsf66" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="registry-server" probeResult="failure" output=< Oct 07 14:08:19 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 14:08:19 crc kubenswrapper[4959]: > Oct 07 14:08:20 crc kubenswrapper[4959]: I1007 14:08:20.870958 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:20 crc kubenswrapper[4959]: I1007 14:08:20.871463 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:21 crc kubenswrapper[4959]: I1007 14:08:21.921303 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-wqnxt" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="registry-server" probeResult="failure" output=< Oct 07 14:08:21 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 14:08:21 crc kubenswrapper[4959]: > Oct 07 14:08:28 crc kubenswrapper[4959]: I1007 14:08:28.494155 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:28 crc kubenswrapper[4959]: I1007 14:08:28.547295 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:28 crc kubenswrapper[4959]: I1007 14:08:28.750035 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fsf66"] Oct 07 14:08:30 crc kubenswrapper[4959]: I1007 14:08:30.361352 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fsf66" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="registry-server" 
containerID="cri-o://0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b" gracePeriod=2 Oct 07 14:08:30 crc kubenswrapper[4959]: I1007 14:08:30.920816 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:30 crc kubenswrapper[4959]: I1007 14:08:30.987477 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.105422 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.237860 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-utilities\") pod \"11038622-628d-4ae0-a297-352e8ea496e1\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.237957 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tsqz\" (UniqueName: \"kubernetes.io/projected/11038622-628d-4ae0-a297-352e8ea496e1-kube-api-access-8tsqz\") pod \"11038622-628d-4ae0-a297-352e8ea496e1\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.238015 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-catalog-content\") pod \"11038622-628d-4ae0-a297-352e8ea496e1\" (UID: \"11038622-628d-4ae0-a297-352e8ea496e1\") " Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.240287 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-utilities" (OuterVolumeSpecName: "utilities") pod 
"11038622-628d-4ae0-a297-352e8ea496e1" (UID: "11038622-628d-4ae0-a297-352e8ea496e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.244940 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11038622-628d-4ae0-a297-352e8ea496e1-kube-api-access-8tsqz" (OuterVolumeSpecName: "kube-api-access-8tsqz") pod "11038622-628d-4ae0-a297-352e8ea496e1" (UID: "11038622-628d-4ae0-a297-352e8ea496e1"). InnerVolumeSpecName "kube-api-access-8tsqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.324187 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11038622-628d-4ae0-a297-352e8ea496e1" (UID: "11038622-628d-4ae0-a297-352e8ea496e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.340742 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.340785 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tsqz\" (UniqueName: \"kubernetes.io/projected/11038622-628d-4ae0-a297-352e8ea496e1-kube-api-access-8tsqz\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.340797 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11038622-628d-4ae0-a297-352e8ea496e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.379353 4959 generic.go:334] "Generic (PLEG): container finished" podID="11038622-628d-4ae0-a297-352e8ea496e1" containerID="0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b" exitCode=0 Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.380164 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fsf66" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.380160 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerDied","Data":"0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b"} Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.380309 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsf66" event={"ID":"11038622-628d-4ae0-a297-352e8ea496e1","Type":"ContainerDied","Data":"78ab201084f5be88cde10546f04306944059b62015a53feaced2fe7f1a1777e4"} Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.380322 4959 scope.go:117] "RemoveContainer" containerID="0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.410824 4959 scope.go:117] "RemoveContainer" containerID="546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.517163 4959 scope.go:117] "RemoveContainer" containerID="e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.518976 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fsf66"] Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.528493 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fsf66"] Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.566161 4959 scope.go:117] "RemoveContainer" containerID="0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b" Oct 07 14:08:31 crc kubenswrapper[4959]: E1007 14:08:31.566905 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b\": container with ID starting with 0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b not found: ID does not exist" containerID="0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.566962 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b"} err="failed to get container status \"0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b\": rpc error: code = NotFound desc = could not find container \"0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b\": container with ID starting with 0a1e2c54e795058447131dcc41f9ea3127497ee00fdb1477b7f53951e1240b0b not found: ID does not exist" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.566994 4959 scope.go:117] "RemoveContainer" containerID="546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251" Oct 07 14:08:31 crc kubenswrapper[4959]: E1007 14:08:31.567714 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251\": container with ID starting with 546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251 not found: ID does not exist" containerID="546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.567762 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251"} err="failed to get container status \"546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251\": rpc error: code = NotFound desc = could not find container \"546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251\": container with ID 
starting with 546c6c11a63994bc7e7d8f810ae6f828e65b3c0cbf301bc740ee33a05e336251 not found: ID does not exist" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.567790 4959 scope.go:117] "RemoveContainer" containerID="e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792" Oct 07 14:08:31 crc kubenswrapper[4959]: E1007 14:08:31.568076 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792\": container with ID starting with e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792 not found: ID does not exist" containerID="e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792" Oct 07 14:08:31 crc kubenswrapper[4959]: I1007 14:08:31.568109 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792"} err="failed to get container status \"e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792\": rpc error: code = NotFound desc = could not find container \"e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792\": container with ID starting with e462863ab2f6a0112b8c469a1c4a63251ee3cfc7d6b2943485ca2bab06250792 not found: ID does not exist" Oct 07 14:08:32 crc kubenswrapper[4959]: I1007 14:08:32.822005 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11038622-628d-4ae0-a297-352e8ea496e1" path="/var/lib/kubelet/pods/11038622-628d-4ae0-a297-352e8ea496e1/volumes" Oct 07 14:08:32 crc kubenswrapper[4959]: I1007 14:08:32.935492 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqnxt"] Oct 07 14:08:32 crc kubenswrapper[4959]: I1007 14:08:32.935791 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wqnxt" 
podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="registry-server" containerID="cri-o://8acb3e78c3a178d3bddc62f3181cc4a7f290339e04a3f4146229acfa562a38f3" gracePeriod=2 Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.404847 4959 generic.go:334] "Generic (PLEG): container finished" podID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerID="8acb3e78c3a178d3bddc62f3181cc4a7f290339e04a3f4146229acfa562a38f3" exitCode=0 Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.404885 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerDied","Data":"8acb3e78c3a178d3bddc62f3181cc4a7f290339e04a3f4146229acfa562a38f3"} Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.605999 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.707720 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8fq6\" (UniqueName: \"kubernetes.io/projected/0fa26bfd-4429-49f5-a05e-aff4231abcc8-kube-api-access-r8fq6\") pod \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.707819 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-utilities\") pod \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\" (UID: \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.707960 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-catalog-content\") pod \"0fa26bfd-4429-49f5-a05e-aff4231abcc8\" (UID: 
\"0fa26bfd-4429-49f5-a05e-aff4231abcc8\") " Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.708609 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-utilities" (OuterVolumeSpecName: "utilities") pod "0fa26bfd-4429-49f5-a05e-aff4231abcc8" (UID: "0fa26bfd-4429-49f5-a05e-aff4231abcc8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.726298 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fa26bfd-4429-49f5-a05e-aff4231abcc8" (UID: "0fa26bfd-4429-49f5-a05e-aff4231abcc8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.731892 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa26bfd-4429-49f5-a05e-aff4231abcc8-kube-api-access-r8fq6" (OuterVolumeSpecName: "kube-api-access-r8fq6") pod "0fa26bfd-4429-49f5-a05e-aff4231abcc8" (UID: "0fa26bfd-4429-49f5-a05e-aff4231abcc8"). InnerVolumeSpecName "kube-api-access-r8fq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.810723 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8fq6\" (UniqueName: \"kubernetes.io/projected/0fa26bfd-4429-49f5-a05e-aff4231abcc8-kube-api-access-r8fq6\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.810764 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:33 crc kubenswrapper[4959]: I1007 14:08:33.810778 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa26bfd-4429-49f5-a05e-aff4231abcc8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.416123 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqnxt" event={"ID":"0fa26bfd-4429-49f5-a05e-aff4231abcc8","Type":"ContainerDied","Data":"4cbb3561d6126b3095f2114783de023f1517d63848949e8aa58bcb44a944211d"} Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.416482 4959 scope.go:117] "RemoveContainer" containerID="8acb3e78c3a178d3bddc62f3181cc4a7f290339e04a3f4146229acfa562a38f3" Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.416583 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqnxt" Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.448252 4959 scope.go:117] "RemoveContainer" containerID="f31a735714b4a40753b90761af79cd5325b7f251cb508d26d878213d8bb629c3" Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.454499 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqnxt"] Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.464487 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqnxt"] Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.472474 4959 scope.go:117] "RemoveContainer" containerID="b4e6d84dcb113f59ec4340d979bebc360210f081371902043fcc098d8594a4eb" Oct 07 14:08:34 crc kubenswrapper[4959]: I1007 14:08:34.822869 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" path="/var/lib/kubelet/pods/0fa26bfd-4429-49f5-a05e-aff4231abcc8/volumes" Oct 07 14:10:37 crc kubenswrapper[4959]: I1007 14:10:37.695088 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:10:37 crc kubenswrapper[4959]: I1007 14:10:37.696405 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:11:07 crc kubenswrapper[4959]: I1007 14:11:07.695363 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:11:07 crc kubenswrapper[4959]: I1007 14:11:07.696123 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.695590 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.696903 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.697025 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.697825 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8db342c1eccd2ddb10f780bbd04720d9a58bca24ac7139df92d7010fec25c49c"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.697959 4959 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://8db342c1eccd2ddb10f780bbd04720d9a58bca24ac7139df92d7010fec25c49c" gracePeriod=600 Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.994250 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="8db342c1eccd2ddb10f780bbd04720d9a58bca24ac7139df92d7010fec25c49c" exitCode=0 Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.994325 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"8db342c1eccd2ddb10f780bbd04720d9a58bca24ac7139df92d7010fec25c49c"} Oct 07 14:11:37 crc kubenswrapper[4959]: I1007 14:11:37.994680 4959 scope.go:117] "RemoveContainer" containerID="2cdf89fa7ee313fe0593dafb9df91c3562bd70cd31707eb78111b4fcacf74d37" Oct 07 14:11:39 crc kubenswrapper[4959]: I1007 14:11:39.005138 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"} Oct 07 14:14:07 crc kubenswrapper[4959]: I1007 14:14:07.696209 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:14:07 crc kubenswrapper[4959]: I1007 14:14:07.696742 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:14:37 crc kubenswrapper[4959]: I1007 14:14:37.695170 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:14:37 crc kubenswrapper[4959]: I1007 14:14:37.695782 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.141725 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"] Oct 07 14:15:00 crc kubenswrapper[4959]: E1007 14:15:00.142889 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="registry-server" Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.142910 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="registry-server" Oct 07 14:15:00 crc kubenswrapper[4959]: E1007 14:15:00.142921 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="extract-content" Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.142929 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="extract-content" Oct 07 14:15:00 crc kubenswrapper[4959]: E1007 
14:15:00.142972 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="registry-server"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.142984 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="registry-server"
Oct 07 14:15:00 crc kubenswrapper[4959]: E1007 14:15:00.143011 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="extract-utilities"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.143019 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="extract-utilities"
Oct 07 14:15:00 crc kubenswrapper[4959]: E1007 14:15:00.143075 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="extract-content"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.143085 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="extract-content"
Oct 07 14:15:00 crc kubenswrapper[4959]: E1007 14:15:00.143132 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="extract-utilities"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.143142 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="extract-utilities"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.143473 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa26bfd-4429-49f5-a05e-aff4231abcc8" containerName="registry-server"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.143498 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="11038622-628d-4ae0-a297-352e8ea496e1" containerName="registry-server"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.144284 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.147127 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.147228 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.153286 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"]
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.223159 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tgmf\" (UniqueName: \"kubernetes.io/projected/1178c1d0-6adf-4cda-9dc7-927ca47f1659-kube-api-access-5tgmf\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.223559 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1178c1d0-6adf-4cda-9dc7-927ca47f1659-config-volume\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.223752 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1178c1d0-6adf-4cda-9dc7-927ca47f1659-secret-volume\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.325712 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgmf\" (UniqueName: \"kubernetes.io/projected/1178c1d0-6adf-4cda-9dc7-927ca47f1659-kube-api-access-5tgmf\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.325815 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1178c1d0-6adf-4cda-9dc7-927ca47f1659-config-volume\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.325969 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1178c1d0-6adf-4cda-9dc7-927ca47f1659-secret-volume\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.327056 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1178c1d0-6adf-4cda-9dc7-927ca47f1659-config-volume\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.368408 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1178c1d0-6adf-4cda-9dc7-927ca47f1659-secret-volume\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.368849 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tgmf\" (UniqueName: \"kubernetes.io/projected/1178c1d0-6adf-4cda-9dc7-927ca47f1659-kube-api-access-5tgmf\") pod \"collect-profiles-29330775-2f9jw\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:00 crc kubenswrapper[4959]: I1007 14:15:00.478408 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:01 crc kubenswrapper[4959]: I1007 14:15:00.999868 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"]
Oct 07 14:15:01 crc kubenswrapper[4959]: I1007 14:15:01.737738 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw" event={"ID":"1178c1d0-6adf-4cda-9dc7-927ca47f1659","Type":"ContainerStarted","Data":"63d4165986cf720d219fca4adc1de02bec19865d1c157e16050aa812ecc6d795"}
Oct 07 14:15:01 crc kubenswrapper[4959]: I1007 14:15:01.738138 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw" event={"ID":"1178c1d0-6adf-4cda-9dc7-927ca47f1659","Type":"ContainerStarted","Data":"b28b834491fe6b5db9d83192cdf9aac2368d1f5929e1d64b27ead61126963a75"}
Oct 07 14:15:01 crc kubenswrapper[4959]: I1007 14:15:01.756050 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw" podStartSLOduration=1.756032024 podStartE2EDuration="1.756032024s" podCreationTimestamp="2025-10-07 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:01.750015461 +0000 UTC m=+4453.910738148" watchObservedRunningTime="2025-10-07 14:15:01.756032024 +0000 UTC m=+4453.916754701"
Oct 07 14:15:02 crc kubenswrapper[4959]: I1007 14:15:02.748245 4959 generic.go:334] "Generic (PLEG): container finished" podID="1178c1d0-6adf-4cda-9dc7-927ca47f1659" containerID="63d4165986cf720d219fca4adc1de02bec19865d1c157e16050aa812ecc6d795" exitCode=0
Oct 07 14:15:02 crc kubenswrapper[4959]: I1007 14:15:02.748299 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw" event={"ID":"1178c1d0-6adf-4cda-9dc7-927ca47f1659","Type":"ContainerDied","Data":"63d4165986cf720d219fca4adc1de02bec19865d1c157e16050aa812ecc6d795"}
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.623772 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.708800 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tgmf\" (UniqueName: \"kubernetes.io/projected/1178c1d0-6adf-4cda-9dc7-927ca47f1659-kube-api-access-5tgmf\") pod \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") "
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.708922 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1178c1d0-6adf-4cda-9dc7-927ca47f1659-secret-volume\") pod \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") "
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.709030 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1178c1d0-6adf-4cda-9dc7-927ca47f1659-config-volume\") pod \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\" (UID: \"1178c1d0-6adf-4cda-9dc7-927ca47f1659\") "
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.710055 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1178c1d0-6adf-4cda-9dc7-927ca47f1659-config-volume" (OuterVolumeSpecName: "config-volume") pod "1178c1d0-6adf-4cda-9dc7-927ca47f1659" (UID: "1178c1d0-6adf-4cda-9dc7-927ca47f1659"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.719898 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1178c1d0-6adf-4cda-9dc7-927ca47f1659-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1178c1d0-6adf-4cda-9dc7-927ca47f1659" (UID: "1178c1d0-6adf-4cda-9dc7-927ca47f1659"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.721256 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1178c1d0-6adf-4cda-9dc7-927ca47f1659-kube-api-access-5tgmf" (OuterVolumeSpecName: "kube-api-access-5tgmf") pod "1178c1d0-6adf-4cda-9dc7-927ca47f1659" (UID: "1178c1d0-6adf-4cda-9dc7-927ca47f1659"). InnerVolumeSpecName "kube-api-access-5tgmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.765226 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw" event={"ID":"1178c1d0-6adf-4cda-9dc7-927ca47f1659","Type":"ContainerDied","Data":"b28b834491fe6b5db9d83192cdf9aac2368d1f5929e1d64b27ead61126963a75"}
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.765277 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b28b834491fe6b5db9d83192cdf9aac2368d1f5929e1d64b27ead61126963a75"
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.765251 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.812438 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1178c1d0-6adf-4cda-9dc7-927ca47f1659-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.812491 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tgmf\" (UniqueName: \"kubernetes.io/projected/1178c1d0-6adf-4cda-9dc7-927ca47f1659-kube-api-access-5tgmf\") on node \"crc\" DevicePath \"\""
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.812502 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1178c1d0-6adf-4cda-9dc7-927ca47f1659-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.823798 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt"]
Oct 07 14:15:04 crc kubenswrapper[4959]: I1007 14:15:04.830994 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-sh2jt"]
Oct 07 14:15:06 crc kubenswrapper[4959]: I1007 14:15:06.841135 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931b10c1-afe8-4537-8dee-4581dfd7ae27" path="/var/lib/kubelet/pods/931b10c1-afe8-4537-8dee-4581dfd7ae27/volumes"
Oct 07 14:15:07 crc kubenswrapper[4959]: I1007 14:15:07.695319 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:15:07 crc kubenswrapper[4959]: I1007 14:15:07.695375 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:15:07 crc kubenswrapper[4959]: I1007 14:15:07.695416 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 14:15:07 crc kubenswrapper[4959]: I1007 14:15:07.696162 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 14:15:07 crc kubenswrapper[4959]: I1007 14:15:07.696230 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" gracePeriod=600
Oct 07 14:15:07 crc kubenswrapper[4959]: E1007 14:15:07.819934 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:15:08 crc kubenswrapper[4959]: I1007 14:15:08.806845 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" exitCode=0
Oct 07 14:15:08 crc kubenswrapper[4959]: I1007 14:15:08.806902 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"}
Oct 07 14:15:08 crc kubenswrapper[4959]: I1007 14:15:08.806943 4959 scope.go:117] "RemoveContainer" containerID="8db342c1eccd2ddb10f780bbd04720d9a58bca24ac7139df92d7010fec25c49c"
Oct 07 14:15:08 crc kubenswrapper[4959]: I1007 14:15:08.808319 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:15:08 crc kubenswrapper[4959]: E1007 14:15:08.809441 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:15:22 crc kubenswrapper[4959]: I1007 14:15:22.809207 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:15:22 crc kubenswrapper[4959]: E1007 14:15:22.811074 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:15:34 crc kubenswrapper[4959]: I1007 14:15:34.808562 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:15:34 crc kubenswrapper[4959]: E1007 14:15:34.809510 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:15:49 crc kubenswrapper[4959]: I1007 14:15:49.809369 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:15:49 crc kubenswrapper[4959]: E1007 14:15:49.810288 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:15:55 crc kubenswrapper[4959]: I1007 14:15:55.413760 4959 scope.go:117] "RemoveContainer" containerID="dd12d9871942dfc0c11254ef6b6c80b499d8e17730395f2bfc9b362d50eec73d"
Oct 07 14:16:01 crc kubenswrapper[4959]: I1007 14:16:01.809647 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:16:01 crc kubenswrapper[4959]: E1007 14:16:01.810525 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:16:12 crc kubenswrapper[4959]: I1007 14:16:12.809470 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:16:12 crc kubenswrapper[4959]: E1007 14:16:12.811540 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:16:23 crc kubenswrapper[4959]: I1007 14:16:23.809508 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:16:23 crc kubenswrapper[4959]: E1007 14:16:23.810154 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:16:38 crc kubenswrapper[4959]: I1007 14:16:38.818148 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:16:38 crc kubenswrapper[4959]: E1007 14:16:38.820382 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:16:51 crc kubenswrapper[4959]: I1007 14:16:51.808723 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:16:51 crc kubenswrapper[4959]: E1007 14:16:51.809416 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:17:02 crc kubenswrapper[4959]: I1007 14:17:02.809296 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:17:02 crc kubenswrapper[4959]: E1007 14:17:02.810144 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:17:15 crc kubenswrapper[4959]: I1007 14:17:15.808497 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:17:15 crc kubenswrapper[4959]: E1007 14:17:15.809573 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:17:27 crc kubenswrapper[4959]: I1007 14:17:27.809768 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:17:27 crc kubenswrapper[4959]: E1007 14:17:27.810774 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:17:28 crc kubenswrapper[4959]: I1007 14:17:28.928152 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwqvl"]
Oct 07 14:17:28 crc kubenswrapper[4959]: E1007 14:17:28.929010 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1178c1d0-6adf-4cda-9dc7-927ca47f1659" containerName="collect-profiles"
Oct 07 14:17:28 crc kubenswrapper[4959]: I1007 14:17:28.929026 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1178c1d0-6adf-4cda-9dc7-927ca47f1659" containerName="collect-profiles"
Oct 07 14:17:28 crc kubenswrapper[4959]: I1007 14:17:28.929197 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1178c1d0-6adf-4cda-9dc7-927ca47f1659" containerName="collect-profiles"
Oct 07 14:17:28 crc kubenswrapper[4959]: I1007 14:17:28.930514 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:28 crc kubenswrapper[4959]: I1007 14:17:28.947867 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwqvl"]
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.065177 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-catalog-content\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.065244 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592nf\" (UniqueName: \"kubernetes.io/projected/74412eb4-4e5a-4395-8f5e-b319a3bf5378-kube-api-access-592nf\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.065536 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-utilities\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.167704 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-592nf\" (UniqueName: \"kubernetes.io/projected/74412eb4-4e5a-4395-8f5e-b319a3bf5378-kube-api-access-592nf\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.167996 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-utilities\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.168125 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-catalog-content\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.168653 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-catalog-content\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.168827 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-utilities\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.195737 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-592nf\" (UniqueName: \"kubernetes.io/projected/74412eb4-4e5a-4395-8f5e-b319a3bf5378-kube-api-access-592nf\") pod \"certified-operators-bwqvl\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") " pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.254974 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.806566 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwqvl"]
Oct 07 14:17:29 crc kubenswrapper[4959]: I1007 14:17:29.975402 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerStarted","Data":"c2474154f89829b4424330134726730a62b46439ebc2201a539544d67f424f5f"}
Oct 07 14:17:30 crc kubenswrapper[4959]: I1007 14:17:30.987253 4959 generic.go:334] "Generic (PLEG): container finished" podID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerID="3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8" exitCode=0
Oct 07 14:17:30 crc kubenswrapper[4959]: I1007 14:17:30.987518 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerDied","Data":"3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8"}
Oct 07 14:17:30 crc kubenswrapper[4959]: I1007 14:17:30.989770 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 14:17:33 crc kubenswrapper[4959]: I1007 14:17:33.007994 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerStarted","Data":"f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7"}
Oct 07 14:17:35 crc kubenswrapper[4959]: I1007 14:17:35.025457 4959 generic.go:334] "Generic (PLEG): container finished" podID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerID="f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7" exitCode=0
Oct 07 14:17:35 crc kubenswrapper[4959]: I1007 14:17:35.025551 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerDied","Data":"f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7"}
Oct 07 14:17:36 crc kubenswrapper[4959]: I1007 14:17:36.039922 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerStarted","Data":"8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec"}
Oct 07 14:17:36 crc kubenswrapper[4959]: I1007 14:17:36.072274 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwqvl" podStartSLOduration=3.587297646 podStartE2EDuration="8.072256244s" podCreationTimestamp="2025-10-07 14:17:28 +0000 UTC" firstStartedPulling="2025-10-07 14:17:30.989516962 +0000 UTC m=+4603.150239639" lastFinishedPulling="2025-10-07 14:17:35.47447556 +0000 UTC m=+4607.635198237" observedRunningTime="2025-10-07 14:17:36.062089282 +0000 UTC m=+4608.222811979" watchObservedRunningTime="2025-10-07 14:17:36.072256244 +0000 UTC m=+4608.232978921"
Oct 07 14:17:39 crc kubenswrapper[4959]: I1007 14:17:39.255507 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:39 crc kubenswrapper[4959]: I1007 14:17:39.256162 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:39 crc kubenswrapper[4959]: I1007 14:17:39.301766 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:40 crc kubenswrapper[4959]: I1007 14:17:40.124550 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:40 crc kubenswrapper[4959]: I1007 14:17:40.170975 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwqvl"]
Oct 07 14:17:41 crc kubenswrapper[4959]: I1007 14:17:41.809221 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:17:41 crc kubenswrapper[4959]: E1007 14:17:41.809817 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:17:42 crc kubenswrapper[4959]: I1007 14:17:42.086127 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bwqvl" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="registry-server" containerID="cri-o://8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec" gracePeriod=2
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.003469 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.058754 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-592nf\" (UniqueName: \"kubernetes.io/projected/74412eb4-4e5a-4395-8f5e-b319a3bf5378-kube-api-access-592nf\") pod \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") "
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.058872 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-catalog-content\") pod \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") "
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.058949 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-utilities\") pod \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\" (UID: \"74412eb4-4e5a-4395-8f5e-b319a3bf5378\") "
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.060936 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-utilities" (OuterVolumeSpecName: "utilities") pod "74412eb4-4e5a-4395-8f5e-b319a3bf5378" (UID: "74412eb4-4e5a-4395-8f5e-b319a3bf5378"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.067133 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74412eb4-4e5a-4395-8f5e-b319a3bf5378-kube-api-access-592nf" (OuterVolumeSpecName: "kube-api-access-592nf") pod "74412eb4-4e5a-4395-8f5e-b319a3bf5378" (UID: "74412eb4-4e5a-4395-8f5e-b319a3bf5378"). InnerVolumeSpecName "kube-api-access-592nf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.099012 4959 generic.go:334] "Generic (PLEG): container finished" podID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerID="8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec" exitCode=0
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.099069 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwqvl"
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.099075 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerDied","Data":"8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec"}
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.099171 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwqvl" event={"ID":"74412eb4-4e5a-4395-8f5e-b319a3bf5378","Type":"ContainerDied","Data":"c2474154f89829b4424330134726730a62b46439ebc2201a539544d67f424f5f"}
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.099202 4959 scope.go:117] "RemoveContainer" containerID="8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec"
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.109328 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74412eb4-4e5a-4395-8f5e-b319a3bf5378" (UID: "74412eb4-4e5a-4395-8f5e-b319a3bf5378"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.125574 4959 scope.go:117] "RemoveContainer" containerID="f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7"
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.146151 4959 scope.go:117] "RemoveContainer" containerID="3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8"
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.161328 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.161360 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-592nf\" (UniqueName: \"kubernetes.io/projected/74412eb4-4e5a-4395-8f5e-b319a3bf5378-kube-api-access-592nf\") on node \"crc\" DevicePath \"\""
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.161369 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74412eb4-4e5a-4395-8f5e-b319a3bf5378-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.209538 4959 scope.go:117] "RemoveContainer" containerID="8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec"
Oct 07 14:17:43 crc kubenswrapper[4959]: E1007 14:17:43.210082 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec\": container with ID starting with 8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec not found: ID does not exist" containerID="8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec"
Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.210122 4959 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec"} err="failed to get container status \"8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec\": rpc error: code = NotFound desc = could not find container \"8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec\": container with ID starting with 8959987b2f501abc7a1620c941275fca0a20eb82590729aab88428f64e140aec not found: ID does not exist" Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.210147 4959 scope.go:117] "RemoveContainer" containerID="f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7" Oct 07 14:17:43 crc kubenswrapper[4959]: E1007 14:17:43.210522 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7\": container with ID starting with f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7 not found: ID does not exist" containerID="f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7" Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.210545 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7"} err="failed to get container status \"f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7\": rpc error: code = NotFound desc = could not find container \"f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7\": container with ID starting with f8cfac7df45732bde08e4c1fa28eb916b0967d207bc94bd80e21f700327178b7 not found: ID does not exist" Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.210568 4959 scope.go:117] "RemoveContainer" containerID="3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8" Oct 07 14:17:43 crc kubenswrapper[4959]: E1007 14:17:43.211182 4959 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8\": container with ID starting with 3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8 not found: ID does not exist" containerID="3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8" Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.211213 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8"} err="failed to get container status \"3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8\": rpc error: code = NotFound desc = could not find container \"3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8\": container with ID starting with 3dd1345d70b6ae83e47511835fddea55f6fb7f6720fcb9ae083438fc458a7ca8 not found: ID does not exist" Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.439965 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwqvl"] Oct 07 14:17:43 crc kubenswrapper[4959]: I1007 14:17:43.451542 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bwqvl"] Oct 07 14:17:44 crc kubenswrapper[4959]: I1007 14:17:44.824806 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" path="/var/lib/kubelet/pods/74412eb4-4e5a-4395-8f5e-b319a3bf5378/volumes" Oct 07 14:17:56 crc kubenswrapper[4959]: I1007 14:17:56.809076 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" Oct 07 14:17:56 crc kubenswrapper[4959]: E1007 14:17:56.810086 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:18:08 crc kubenswrapper[4959]: I1007 14:18:08.817039 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" Oct 07 14:18:08 crc kubenswrapper[4959]: E1007 14:18:08.817905 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:18:22 crc kubenswrapper[4959]: I1007 14:18:22.808642 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" Oct 07 14:18:22 crc kubenswrapper[4959]: E1007 14:18:22.809560 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:18:34 crc kubenswrapper[4959]: I1007 14:18:34.808953 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" Oct 07 14:18:34 crc kubenswrapper[4959]: E1007 14:18:34.809754 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.559653 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcsjb"] Oct 07 14:18:45 crc kubenswrapper[4959]: E1007 14:18:45.560540 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="registry-server" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.560557 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="registry-server" Oct 07 14:18:45 crc kubenswrapper[4959]: E1007 14:18:45.560586 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="extract-content" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.560594 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="extract-content" Oct 07 14:18:45 crc kubenswrapper[4959]: E1007 14:18:45.560644 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="extract-utilities" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.560654 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="extract-utilities" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.560863 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="74412eb4-4e5a-4395-8f5e-b319a3bf5378" containerName="registry-server" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.562356 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.568737 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcsjb"] Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.646138 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-catalog-content\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.646240 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-utilities\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.646349 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czffb\" (UniqueName: \"kubernetes.io/projected/437ae3d1-9c29-4e15-b7ac-44a0611440e0-kube-api-access-czffb\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.748869 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-utilities\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.749418 4959 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-utilities\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.749742 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czffb\" (UniqueName: \"kubernetes.io/projected/437ae3d1-9c29-4e15-b7ac-44a0611440e0-kube-api-access-czffb\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.750249 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-catalog-content\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.750754 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-catalog-content\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.777350 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czffb\" (UniqueName: \"kubernetes.io/projected/437ae3d1-9c29-4e15-b7ac-44a0611440e0-kube-api-access-czffb\") pod \"redhat-marketplace-wcsjb\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:45 crc kubenswrapper[4959]: I1007 14:18:45.904186 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:46 crc kubenswrapper[4959]: I1007 14:18:46.361502 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcsjb"] Oct 07 14:18:46 crc kubenswrapper[4959]: I1007 14:18:46.676072 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcsjb" event={"ID":"437ae3d1-9c29-4e15-b7ac-44a0611440e0","Type":"ContainerStarted","Data":"095752a4f7f6bf336b20c1269cb773c4821e5b011f2e176a1fb4975b32fe0aa2"} Oct 07 14:18:47 crc kubenswrapper[4959]: I1007 14:18:47.686755 4959 generic.go:334] "Generic (PLEG): container finished" podID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerID="53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764" exitCode=0 Oct 07 14:18:47 crc kubenswrapper[4959]: I1007 14:18:47.686819 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcsjb" event={"ID":"437ae3d1-9c29-4e15-b7ac-44a0611440e0","Type":"ContainerDied","Data":"53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764"} Oct 07 14:18:47 crc kubenswrapper[4959]: I1007 14:18:47.808119 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" Oct 07 14:18:47 crc kubenswrapper[4959]: E1007 14:18:47.808723 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:18:49 crc kubenswrapper[4959]: I1007 14:18:49.728445 4959 generic.go:334] "Generic (PLEG): container finished" podID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" 
containerID="2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9" exitCode=0 Oct 07 14:18:49 crc kubenswrapper[4959]: I1007 14:18:49.728527 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcsjb" event={"ID":"437ae3d1-9c29-4e15-b7ac-44a0611440e0","Type":"ContainerDied","Data":"2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9"} Oct 07 14:18:49 crc kubenswrapper[4959]: E1007 14:18:49.834548 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-conmon-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:18:50 crc kubenswrapper[4959]: I1007 14:18:50.741867 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcsjb" event={"ID":"437ae3d1-9c29-4e15-b7ac-44a0611440e0","Type":"ContainerStarted","Data":"099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f"} Oct 07 14:18:50 crc kubenswrapper[4959]: I1007 14:18:50.761811 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcsjb" podStartSLOduration=3.229999732 podStartE2EDuration="5.761794078s" podCreationTimestamp="2025-10-07 14:18:45 +0000 UTC" firstStartedPulling="2025-10-07 14:18:47.68917744 +0000 UTC m=+4679.849900117" lastFinishedPulling="2025-10-07 14:18:50.220971786 +0000 UTC m=+4682.381694463" observedRunningTime="2025-10-07 14:18:50.761458578 +0000 UTC m=+4682.922181255" watchObservedRunningTime="2025-10-07 14:18:50.761794078 
+0000 UTC m=+4682.922516775" Oct 07 14:18:55 crc kubenswrapper[4959]: I1007 14:18:55.905128 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:55 crc kubenswrapper[4959]: I1007 14:18:55.905809 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:55 crc kubenswrapper[4959]: I1007 14:18:55.958386 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:56 crc kubenswrapper[4959]: I1007 14:18:56.850397 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:56 crc kubenswrapper[4959]: I1007 14:18:56.917339 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcsjb"] Oct 07 14:18:58 crc kubenswrapper[4959]: I1007 14:18:58.826804 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wcsjb" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="registry-server" containerID="cri-o://099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f" gracePeriod=2 Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.488453 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.592669 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-utilities\") pod \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.592914 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czffb\" (UniqueName: \"kubernetes.io/projected/437ae3d1-9c29-4e15-b7ac-44a0611440e0-kube-api-access-czffb\") pod \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.593025 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-catalog-content\") pod \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\" (UID: \"437ae3d1-9c29-4e15-b7ac-44a0611440e0\") " Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.594849 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-utilities" (OuterVolumeSpecName: "utilities") pod "437ae3d1-9c29-4e15-b7ac-44a0611440e0" (UID: "437ae3d1-9c29-4e15-b7ac-44a0611440e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.606591 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437ae3d1-9c29-4e15-b7ac-44a0611440e0-kube-api-access-czffb" (OuterVolumeSpecName: "kube-api-access-czffb") pod "437ae3d1-9c29-4e15-b7ac-44a0611440e0" (UID: "437ae3d1-9c29-4e15-b7ac-44a0611440e0"). InnerVolumeSpecName "kube-api-access-czffb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.608617 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "437ae3d1-9c29-4e15-b7ac-44a0611440e0" (UID: "437ae3d1-9c29-4e15-b7ac-44a0611440e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.695047 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.695083 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czffb\" (UniqueName: \"kubernetes.io/projected/437ae3d1-9c29-4e15-b7ac-44a0611440e0-kube-api-access-czffb\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.695093 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437ae3d1-9c29-4e15-b7ac-44a0611440e0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.809016 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e" Oct 07 14:18:59 crc kubenswrapper[4959]: E1007 14:18:59.809262 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:18:59 
crc kubenswrapper[4959]: I1007 14:18:59.839345 4959 generic.go:334] "Generic (PLEG): container finished" podID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerID="099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f" exitCode=0 Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.839404 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcsjb" event={"ID":"437ae3d1-9c29-4e15-b7ac-44a0611440e0","Type":"ContainerDied","Data":"099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f"} Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.839431 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcsjb" event={"ID":"437ae3d1-9c29-4e15-b7ac-44a0611440e0","Type":"ContainerDied","Data":"095752a4f7f6bf336b20c1269cb773c4821e5b011f2e176a1fb4975b32fe0aa2"} Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.839463 4959 scope.go:117] "RemoveContainer" containerID="099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.839462 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcsjb" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.889054 4959 scope.go:117] "RemoveContainer" containerID="2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.901601 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcsjb"] Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.910173 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcsjb"] Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.919985 4959 scope.go:117] "RemoveContainer" containerID="53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.971336 4959 scope.go:117] "RemoveContainer" containerID="099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f" Oct 07 14:18:59 crc kubenswrapper[4959]: E1007 14:18:59.971822 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f\": container with ID starting with 099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f not found: ID does not exist" containerID="099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.971867 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f"} err="failed to get container status \"099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f\": rpc error: code = NotFound desc = could not find container \"099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f\": container with ID starting with 099b2bd2528f04708cfa7c03b98c518a1004a3a7ac2e792b0c6f460945294d9f not found: 
ID does not exist" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.971894 4959 scope.go:117] "RemoveContainer" containerID="2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9" Oct 07 14:18:59 crc kubenswrapper[4959]: E1007 14:18:59.972339 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9\": container with ID starting with 2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9 not found: ID does not exist" containerID="2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.972362 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9"} err="failed to get container status \"2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9\": rpc error: code = NotFound desc = could not find container \"2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9\": container with ID starting with 2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9 not found: ID does not exist" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.972376 4959 scope.go:117] "RemoveContainer" containerID="53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764" Oct 07 14:18:59 crc kubenswrapper[4959]: E1007 14:18:59.972583 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764\": container with ID starting with 53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764 not found: ID does not exist" containerID="53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764" Oct 07 14:18:59 crc kubenswrapper[4959]: I1007 14:18:59.972605 4959 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764"} err="failed to get container status \"53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764\": rpc error: code = NotFound desc = could not find container \"53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764\": container with ID starting with 53824a91f77ce8ea42c0fdb6fbdd9a23efa280d2650b9b7c51c91b337ad6b764 not found: ID does not exist" Oct 07 14:19:00 crc kubenswrapper[4959]: E1007 14:19:00.136549 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-conmon-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:19:00 crc kubenswrapper[4959]: I1007 14:19:00.819597 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" path="/var/lib/kubelet/pods/437ae3d1-9c29-4e15-b7ac-44a0611440e0/volumes" Oct 07 14:19:10 crc kubenswrapper[4959]: E1007 14:19:10.385444 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-conmon-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache]"
Oct 07 14:19:13 crc kubenswrapper[4959]: I1007 14:19:13.810247 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:19:13 crc kubenswrapper[4959]: E1007 14:19:13.811751 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.937846 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lc2sp"]
Oct 07 14:19:14 crc kubenswrapper[4959]: E1007 14:19:14.938579 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="registry-server"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.938594 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="registry-server"
Oct 07 14:19:14 crc kubenswrapper[4959]: E1007 14:19:14.938617 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="extract-utilities"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.938640 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="extract-utilities"
Oct 07 14:19:14 crc kubenswrapper[4959]: E1007 14:19:14.938654 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="extract-content"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.938661 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="extract-content"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.938880 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="437ae3d1-9c29-4e15-b7ac-44a0611440e0" containerName="registry-server"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.940650 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:14 crc kubenswrapper[4959]: I1007 14:19:14.950849 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc2sp"]
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.017919 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-utilities\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.018019 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9gc\" (UniqueName: \"kubernetes.io/projected/9c77fa8d-043d-4870-bb59-27ef59183093-kube-api-access-ch9gc\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.018205 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-catalog-content\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.119666 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9gc\" (UniqueName: \"kubernetes.io/projected/9c77fa8d-043d-4870-bb59-27ef59183093-kube-api-access-ch9gc\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.119785 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-catalog-content\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.119918 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-utilities\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.120416 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-utilities\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.120448 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-catalog-content\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.311808 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9gc\" (UniqueName: \"kubernetes.io/projected/9c77fa8d-043d-4870-bb59-27ef59183093-kube-api-access-ch9gc\") pod \"redhat-operators-lc2sp\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") " pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:15 crc kubenswrapper[4959]: I1007 14:19:15.562699 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:16 crc kubenswrapper[4959]: I1007 14:19:16.085592 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc2sp"]
Oct 07 14:19:16 crc kubenswrapper[4959]: I1007 14:19:16.985863 4959 generic.go:334] "Generic (PLEG): container finished" podID="9c77fa8d-043d-4870-bb59-27ef59183093" containerID="d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851" exitCode=0
Oct 07 14:19:16 crc kubenswrapper[4959]: I1007 14:19:16.985936 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerDied","Data":"d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851"}
Oct 07 14:19:16 crc kubenswrapper[4959]: I1007 14:19:16.986159 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerStarted","Data":"10fd0a8efb6edbea642970d4a03b85f4aee1e1a77c5d0400bb7a01b54c48a80b"}
Oct 07 14:19:19 crc kubenswrapper[4959]: I1007 14:19:19.016420 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerStarted","Data":"fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04"}
Oct 07 14:19:20 crc kubenswrapper[4959]: I1007 14:19:20.029275 4959 generic.go:334] "Generic (PLEG): container finished" podID="9c77fa8d-043d-4870-bb59-27ef59183093" containerID="fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04" exitCode=0
Oct 07 14:19:20 crc kubenswrapper[4959]: I1007 14:19:20.029368 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerDied","Data":"fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04"}
Oct 07 14:19:20 crc kubenswrapper[4959]: E1007 14:19:20.661859 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-conmon-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache]"
Oct 07 14:19:21 crc kubenswrapper[4959]: I1007 14:19:21.041336 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerStarted","Data":"281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380"}
Oct 07 14:19:21 crc kubenswrapper[4959]: I1007 14:19:21.065367 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lc2sp" podStartSLOduration=3.501587069 podStartE2EDuration="7.065345514s" podCreationTimestamp="2025-10-07 14:19:14 +0000 UTC" firstStartedPulling="2025-10-07 14:19:16.987648158 +0000 UTC m=+4709.148370835" lastFinishedPulling="2025-10-07 14:19:20.551406593 +0000 UTC m=+4712.712129280" observedRunningTime="2025-10-07 14:19:21.056933643 +0000 UTC m=+4713.217656320" watchObservedRunningTime="2025-10-07 14:19:21.065345514 +0000 UTC m=+4713.226068191"
Oct 07 14:19:25 crc kubenswrapper[4959]: I1007 14:19:25.562938 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:25 crc kubenswrapper[4959]: I1007 14:19:25.566159 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:25 crc kubenswrapper[4959]: I1007 14:19:25.623721 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:26 crc kubenswrapper[4959]: I1007 14:19:26.163532 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:26 crc kubenswrapper[4959]: I1007 14:19:26.213554 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc2sp"]
Oct 07 14:19:26 crc kubenswrapper[4959]: I1007 14:19:26.808982 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:19:26 crc kubenswrapper[4959]: E1007 14:19:26.809450 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.106859 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lc2sp" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="registry-server" containerID="cri-o://281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380" gracePeriod=2
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.782565 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.907769 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch9gc\" (UniqueName: \"kubernetes.io/projected/9c77fa8d-043d-4870-bb59-27ef59183093-kube-api-access-ch9gc\") pod \"9c77fa8d-043d-4870-bb59-27ef59183093\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") "
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.907920 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-catalog-content\") pod \"9c77fa8d-043d-4870-bb59-27ef59183093\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") "
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.907998 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-utilities\") pod \"9c77fa8d-043d-4870-bb59-27ef59183093\" (UID: \"9c77fa8d-043d-4870-bb59-27ef59183093\") "
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.908944 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-utilities" (OuterVolumeSpecName: "utilities") pod "9c77fa8d-043d-4870-bb59-27ef59183093" (UID: "9c77fa8d-043d-4870-bb59-27ef59183093"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.918190 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c77fa8d-043d-4870-bb59-27ef59183093-kube-api-access-ch9gc" (OuterVolumeSpecName: "kube-api-access-ch9gc") pod "9c77fa8d-043d-4870-bb59-27ef59183093" (UID: "9c77fa8d-043d-4870-bb59-27ef59183093"). InnerVolumeSpecName "kube-api-access-ch9gc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:19:28 crc kubenswrapper[4959]: I1007 14:19:28.989171 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c77fa8d-043d-4870-bb59-27ef59183093" (UID: "9c77fa8d-043d-4870-bb59-27ef59183093"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.011595 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.011650 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c77fa8d-043d-4870-bb59-27ef59183093-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.011665 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch9gc\" (UniqueName: \"kubernetes.io/projected/9c77fa8d-043d-4870-bb59-27ef59183093-kube-api-access-ch9gc\") on node \"crc\" DevicePath \"\""
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.116140 4959 generic.go:334] "Generic (PLEG): container finished" podID="9c77fa8d-043d-4870-bb59-27ef59183093" containerID="281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380" exitCode=0
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.116192 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerDied","Data":"281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380"}
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.116208 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc2sp"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.117107 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc2sp" event={"ID":"9c77fa8d-043d-4870-bb59-27ef59183093","Type":"ContainerDied","Data":"10fd0a8efb6edbea642970d4a03b85f4aee1e1a77c5d0400bb7a01b54c48a80b"}
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.117114 4959 scope.go:117] "RemoveContainer" containerID="281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.141020 4959 scope.go:117] "RemoveContainer" containerID="fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.157259 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc2sp"]
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.161163 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lc2sp"]
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.161926 4959 scope.go:117] "RemoveContainer" containerID="d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.210570 4959 scope.go:117] "RemoveContainer" containerID="281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380"
Oct 07 14:19:29 crc kubenswrapper[4959]: E1007 14:19:29.211187 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380\": container with ID starting with 281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380 not found: ID does not exist" containerID="281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.211231 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380"} err="failed to get container status \"281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380\": rpc error: code = NotFound desc = could not find container \"281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380\": container with ID starting with 281cfd3d76f2c0cc5c4947af802eccf1ee9e6bacf589fd949c1328f21d7b2380 not found: ID does not exist"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.211251 4959 scope.go:117] "RemoveContainer" containerID="fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04"
Oct 07 14:19:29 crc kubenswrapper[4959]: E1007 14:19:29.211734 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04\": container with ID starting with fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04 not found: ID does not exist" containerID="fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.211757 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04"} err="failed to get container status \"fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04\": rpc error: code = NotFound desc = could not find container \"fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04\": container with ID starting with fac7450b25e245a0ae29410c7e9eae55022dd4dadf93a8741288d3af3260db04 not found: ID does not exist"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.211771 4959 scope.go:117] "RemoveContainer" containerID="d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851"
Oct 07 14:19:29 crc kubenswrapper[4959]: E1007 14:19:29.212281 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851\": container with ID starting with d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851 not found: ID does not exist" containerID="d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851"
Oct 07 14:19:29 crc kubenswrapper[4959]: I1007 14:19:29.212344 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851"} err="failed to get container status \"d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851\": rpc error: code = NotFound desc = could not find container \"d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851\": container with ID starting with d7c22aa3611714efe246a049be3a3dec64baf24c64103a9ab725e038c99a4851 not found: ID does not exist"
Oct 07 14:19:30 crc kubenswrapper[4959]: I1007 14:19:30.819113 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" path="/var/lib/kubelet/pods/9c77fa8d-043d-4870-bb59-27ef59183093/volumes"
Oct 07 14:19:30 crc kubenswrapper[4959]: E1007 14:19:30.906340 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-conmon-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache]"
Oct 07 14:19:39 crc kubenswrapper[4959]: I1007 14:19:39.809363 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:19:39 crc kubenswrapper[4959]: E1007 14:19:39.810769 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:19:41 crc kubenswrapper[4959]: E1007 14:19:41.182443 4959 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-conmon-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ae3d1_9c29_4e15_b7ac_44a0611440e0.slice/crio-2543957d1bb9275dfc12a68a99c9541f109ce98b7e277afffeffcf19a740b8b9.scope\": RecentStats: unable to find data in memory cache]"
Oct 07 14:19:48 crc kubenswrapper[4959]: E1007 14:19:48.844924 4959 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/8b57bd0387484c59e1f9af11fd0e33916284cdebbb5350bc7e7c3e8f3b0ac190/diff" to get inode usage: stat /var/lib/containers/storage/overlay/8b57bd0387484c59e1f9af11fd0e33916284cdebbb5350bc7e7c3e8f3b0ac190/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openshift-marketplace_redhat-marketplace-wcsjb_437ae3d1-9c29-4e15-b7ac-44a0611440e0/extract-content/0.log" to get inode usage: stat /var/log/pods/openshift-marketplace_redhat-marketplace-wcsjb_437ae3d1-9c29-4e15-b7ac-44a0611440e0/extract-content/0.log: no such file or directory
Oct 07 14:19:50 crc kubenswrapper[4959]: I1007 14:19:50.812538 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:19:50 crc kubenswrapper[4959]: E1007 14:19:50.813094 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:20:05 crc kubenswrapper[4959]: I1007 14:20:05.808555 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:20:05 crc kubenswrapper[4959]: E1007 14:20:05.810584 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:20:20 crc kubenswrapper[4959]: I1007 14:20:20.809722 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:20:21 crc kubenswrapper[4959]: I1007 14:20:21.566044 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"9c141e9fdd0b22147182937ac142d7077111b215067fe50ea9b2513ffe1a28d6"}
Oct 07 14:22:37 crc kubenswrapper[4959]: I1007 14:22:37.695300 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:22:37 crc kubenswrapper[4959]: I1007 14:22:37.695890 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:23:07 crc kubenswrapper[4959]: I1007 14:23:07.695446 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:23:07 crc kubenswrapper[4959]: I1007 14:23:07.695971 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:23:37 crc kubenswrapper[4959]: I1007 14:23:37.695659 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:23:37 crc kubenswrapper[4959]: I1007 14:23:37.696408 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:23:37 crc kubenswrapper[4959]: I1007 14:23:37.696453 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 14:23:37 crc kubenswrapper[4959]: I1007 14:23:37.697262 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c141e9fdd0b22147182937ac142d7077111b215067fe50ea9b2513ffe1a28d6"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 14:23:37 crc kubenswrapper[4959]: I1007 14:23:37.697383 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://9c141e9fdd0b22147182937ac142d7077111b215067fe50ea9b2513ffe1a28d6" gracePeriod=600
Oct 07 14:23:38 crc kubenswrapper[4959]: I1007 14:23:38.172918 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="9c141e9fdd0b22147182937ac142d7077111b215067fe50ea9b2513ffe1a28d6" exitCode=0
Oct 07 14:23:38 crc kubenswrapper[4959]: I1007 14:23:38.173001 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"9c141e9fdd0b22147182937ac142d7077111b215067fe50ea9b2513ffe1a28d6"}
Oct 07 14:23:38 crc kubenswrapper[4959]: I1007 14:23:38.173379 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"}
Oct 07 14:23:38 crc kubenswrapper[4959]: I1007 14:23:38.173397 4959 scope.go:117] "RemoveContainer" containerID="ee15c3215e7d19ece81bc68348a19517533c436a5da46ab6fe1dbfa8f1f5103e"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.411195 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fwc66"]
Oct 07 14:23:44 crc kubenswrapper[4959]: E1007 14:23:44.412171 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="registry-server"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.412188 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="registry-server"
Oct 07 14:23:44 crc kubenswrapper[4959]: E1007 14:23:44.412205 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="extract-content"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.412211 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="extract-content"
Oct 07 14:23:44 crc kubenswrapper[4959]: E1007 14:23:44.412235 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="extract-utilities"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.412243 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="extract-utilities"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.412489 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c77fa8d-043d-4870-bb59-27ef59183093" containerName="registry-server"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.414286 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.426753 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwc66"]
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.546230 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-catalog-content\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.546299 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxq84\" (UniqueName: \"kubernetes.io/projected/1019c016-ce69-4591-84d2-f5326e9b420c-kube-api-access-qxq84\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.546324 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-utilities\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.648024 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxq84\" (UniqueName: \"kubernetes.io/projected/1019c016-ce69-4591-84d2-f5326e9b420c-kube-api-access-qxq84\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.648084 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-utilities\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.648282 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-catalog-content\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.648671 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-utilities\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.648718 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-catalog-content\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.666285 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxq84\" (UniqueName: \"kubernetes.io/projected/1019c016-ce69-4591-84d2-f5326e9b420c-kube-api-access-qxq84\") pod \"community-operators-fwc66\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:44 crc kubenswrapper[4959]: I1007 14:23:44.771567 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwc66"
Oct 07 14:23:45 crc kubenswrapper[4959]: I1007 14:23:45.302007 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwc66"]
Oct 07 14:23:46 crc kubenswrapper[4959]: I1007 14:23:46.241367 4959 generic.go:334] "Generic (PLEG): container finished" podID="1019c016-ce69-4591-84d2-f5326e9b420c" containerID="4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae" exitCode=0
Oct 07 14:23:46 crc kubenswrapper[4959]: I1007 14:23:46.241435 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerDied","Data":"4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae"}
Oct 07 14:23:46 crc kubenswrapper[4959]: I1007 14:23:46.242795 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerStarted","Data":"760f7ae639be62e62db7a0e50f65a232164ba9b23c96788d0296cca8cb16a225"}
Oct 07 14:23:46 crc kubenswrapper[4959]: I1007 14:23:46.246451 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 14:23:47 crc kubenswrapper[4959]: I1007 14:23:47.254295 4959 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerStarted","Data":"aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98"} Oct 07 14:23:49 crc kubenswrapper[4959]: I1007 14:23:49.271403 4959 generic.go:334] "Generic (PLEG): container finished" podID="1019c016-ce69-4591-84d2-f5326e9b420c" containerID="aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98" exitCode=0 Oct 07 14:23:49 crc kubenswrapper[4959]: I1007 14:23:49.271585 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerDied","Data":"aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98"} Oct 07 14:23:50 crc kubenswrapper[4959]: I1007 14:23:50.286235 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerStarted","Data":"f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873"} Oct 07 14:23:50 crc kubenswrapper[4959]: I1007 14:23:50.314760 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fwc66" podStartSLOduration=2.503326011 podStartE2EDuration="6.314742165s" podCreationTimestamp="2025-10-07 14:23:44 +0000 UTC" firstStartedPulling="2025-10-07 14:23:46.246182155 +0000 UTC m=+4978.406904832" lastFinishedPulling="2025-10-07 14:23:50.057598309 +0000 UTC m=+4982.218320986" observedRunningTime="2025-10-07 14:23:50.302818533 +0000 UTC m=+4982.463541220" watchObservedRunningTime="2025-10-07 14:23:50.314742165 +0000 UTC m=+4982.475464832" Oct 07 14:23:54 crc kubenswrapper[4959]: I1007 14:23:54.772036 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fwc66" Oct 07 14:23:54 crc kubenswrapper[4959]: I1007 14:23:54.772591 
4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fwc66" Oct 07 14:23:54 crc kubenswrapper[4959]: I1007 14:23:54.819891 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fwc66" Oct 07 14:23:55 crc kubenswrapper[4959]: I1007 14:23:55.372667 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fwc66" Oct 07 14:23:55 crc kubenswrapper[4959]: I1007 14:23:55.428807 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fwc66"] Oct 07 14:23:57 crc kubenswrapper[4959]: I1007 14:23:57.340485 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fwc66" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="registry-server" containerID="cri-o://f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873" gracePeriod=2 Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.038649 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fwc66" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.121067 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxq84\" (UniqueName: \"kubernetes.io/projected/1019c016-ce69-4591-84d2-f5326e9b420c-kube-api-access-qxq84\") pod \"1019c016-ce69-4591-84d2-f5326e9b420c\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.121230 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-utilities\") pod \"1019c016-ce69-4591-84d2-f5326e9b420c\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.121271 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-catalog-content\") pod \"1019c016-ce69-4591-84d2-f5326e9b420c\" (UID: \"1019c016-ce69-4591-84d2-f5326e9b420c\") " Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.122456 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-utilities" (OuterVolumeSpecName: "utilities") pod "1019c016-ce69-4591-84d2-f5326e9b420c" (UID: "1019c016-ce69-4591-84d2-f5326e9b420c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.127193 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1019c016-ce69-4591-84d2-f5326e9b420c-kube-api-access-qxq84" (OuterVolumeSpecName: "kube-api-access-qxq84") pod "1019c016-ce69-4591-84d2-f5326e9b420c" (UID: "1019c016-ce69-4591-84d2-f5326e9b420c"). InnerVolumeSpecName "kube-api-access-qxq84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.188866 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1019c016-ce69-4591-84d2-f5326e9b420c" (UID: "1019c016-ce69-4591-84d2-f5326e9b420c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.224150 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxq84\" (UniqueName: \"kubernetes.io/projected/1019c016-ce69-4591-84d2-f5326e9b420c-kube-api-access-qxq84\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.224186 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.224196 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1019c016-ce69-4591-84d2-f5326e9b420c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.352847 4959 generic.go:334] "Generic (PLEG): container finished" podID="1019c016-ce69-4591-84d2-f5326e9b420c" containerID="f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873" exitCode=0 Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.352918 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerDied","Data":"f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873"} Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.352969 4959 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fwc66" event={"ID":"1019c016-ce69-4591-84d2-f5326e9b420c","Type":"ContainerDied","Data":"760f7ae639be62e62db7a0e50f65a232164ba9b23c96788d0296cca8cb16a225"} Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.352992 4959 scope.go:117] "RemoveContainer" containerID="f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.353167 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwc66" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.389695 4959 scope.go:117] "RemoveContainer" containerID="aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.410079 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fwc66"] Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.416078 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fwc66"] Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.421068 4959 scope.go:117] "RemoveContainer" containerID="4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.468511 4959 scope.go:117] "RemoveContainer" containerID="f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873" Oct 07 14:23:58 crc kubenswrapper[4959]: E1007 14:23:58.468974 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873\": container with ID starting with f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873 not found: ID does not exist" containerID="f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 
14:23:58.469014 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873"} err="failed to get container status \"f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873\": rpc error: code = NotFound desc = could not find container \"f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873\": container with ID starting with f1a254e43bc2123e618c685fa3e13aea1c5f954aa991a7cd7886a1f50e714873 not found: ID does not exist" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.469042 4959 scope.go:117] "RemoveContainer" containerID="aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98" Oct 07 14:23:58 crc kubenswrapper[4959]: E1007 14:23:58.469389 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98\": container with ID starting with aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98 not found: ID does not exist" containerID="aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.469450 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98"} err="failed to get container status \"aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98\": rpc error: code = NotFound desc = could not find container \"aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98\": container with ID starting with aa7d57e8069f50263f899112c75e631e7ee901de0063dcd4aacd972602419c98 not found: ID does not exist" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.469477 4959 scope.go:117] "RemoveContainer" containerID="4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae" Oct 07 14:23:58 crc 
kubenswrapper[4959]: E1007 14:23:58.469835 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae\": container with ID starting with 4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae not found: ID does not exist" containerID="4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.469872 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae"} err="failed to get container status \"4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae\": rpc error: code = NotFound desc = could not find container \"4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae\": container with ID starting with 4d7b8aafe98b4faf4e33f08bc6e9e00143add8db73e21717992fe00559c0a9ae not found: ID does not exist" Oct 07 14:23:58 crc kubenswrapper[4959]: I1007 14:23:58.818389 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" path="/var/lib/kubelet/pods/1019c016-ce69-4591-84d2-f5326e9b420c/volumes" Oct 07 14:26:07 crc kubenswrapper[4959]: I1007 14:26:07.695334 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:26:07 crc kubenswrapper[4959]: I1007 14:26:07.695859 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 07 14:26:37 crc kubenswrapper[4959]: I1007 14:26:37.695634 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:26:37 crc kubenswrapper[4959]: I1007 14:26:37.696209 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:27:07 crc kubenswrapper[4959]: I1007 14:27:07.695730 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:27:07 crc kubenswrapper[4959]: I1007 14:27:07.696283 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:27:07 crc kubenswrapper[4959]: I1007 14:27:07.696334 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 14:27:07 crc kubenswrapper[4959]: I1007 14:27:07.697176 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:27:07 crc kubenswrapper[4959]: I1007 14:27:07.697242 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" gracePeriod=600 Oct 07 14:27:07 crc kubenswrapper[4959]: E1007 14:27:07.820715 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:27:08 crc kubenswrapper[4959]: I1007 14:27:08.020393 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" exitCode=0 Oct 07 14:27:08 crc kubenswrapper[4959]: I1007 14:27:08.020434 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"} Oct 07 14:27:08 crc kubenswrapper[4959]: I1007 14:27:08.020474 4959 scope.go:117] "RemoveContainer" containerID="9c141e9fdd0b22147182937ac142d7077111b215067fe50ea9b2513ffe1a28d6" Oct 07 14:27:08 crc kubenswrapper[4959]: I1007 14:27:08.021216 4959 
scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:27:08 crc kubenswrapper[4959]: E1007 14:27:08.021536 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:27:19 crc kubenswrapper[4959]: I1007 14:27:19.810143 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:27:19 crc kubenswrapper[4959]: E1007 14:27:19.810954 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:27:34 crc kubenswrapper[4959]: I1007 14:27:34.809553 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:27:34 crc kubenswrapper[4959]: E1007 14:27:34.810452 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 
14:27:40.921130 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bmpmh"] Oct 07 14:27:40 crc kubenswrapper[4959]: E1007 14:27:40.925043 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="extract-content" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 14:27:40.925248 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="extract-content" Oct 07 14:27:40 crc kubenswrapper[4959]: E1007 14:27:40.925355 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="registry-server" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 14:27:40.925509 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="registry-server" Oct 07 14:27:40 crc kubenswrapper[4959]: E1007 14:27:40.925591 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="extract-utilities" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 14:27:40.925675 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="extract-utilities" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 14:27:40.925998 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="1019c016-ce69-4591-84d2-f5326e9b420c" containerName="registry-server" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 14:27:40.927825 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:40 crc kubenswrapper[4959]: I1007 14:27:40.937871 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmpmh"] Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.015824 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgfd\" (UniqueName: \"kubernetes.io/projected/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-kube-api-access-5zgfd\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.016083 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-catalog-content\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.016108 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-utilities\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.118468 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-utilities\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.118507 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-catalog-content\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.118675 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgfd\" (UniqueName: \"kubernetes.io/projected/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-kube-api-access-5zgfd\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.119095 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-utilities\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.119112 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-catalog-content\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.143432 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgfd\" (UniqueName: \"kubernetes.io/projected/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-kube-api-access-5zgfd\") pod \"certified-operators-bmpmh\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") " pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.254088 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmpmh" Oct 07 14:27:41 crc kubenswrapper[4959]: I1007 14:27:41.829502 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmpmh"] Oct 07 14:27:41 crc kubenswrapper[4959]: W1007 14:27:41.835976 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3a9b1e_82b9_4c89_be7e_b87629c239e0.slice/crio-e8762633e5b3d17daf906bb7ec0f63cce20433d96fef7acb0af8b8ac2ee7d980 WatchSource:0}: Error finding container e8762633e5b3d17daf906bb7ec0f63cce20433d96fef7acb0af8b8ac2ee7d980: Status 404 returned error can't find the container with id e8762633e5b3d17daf906bb7ec0f63cce20433d96fef7acb0af8b8ac2ee7d980 Oct 07 14:27:42 crc kubenswrapper[4959]: I1007 14:27:42.351288 4959 generic.go:334] "Generic (PLEG): container finished" podID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerID="75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b" exitCode=0 Oct 07 14:27:42 crc kubenswrapper[4959]: I1007 14:27:42.351344 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerDied","Data":"75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b"} Oct 07 14:27:42 crc kubenswrapper[4959]: I1007 14:27:42.351372 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerStarted","Data":"e8762633e5b3d17daf906bb7ec0f63cce20433d96fef7acb0af8b8ac2ee7d980"} Oct 07 14:27:44 crc kubenswrapper[4959]: I1007 14:27:44.377401 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" 
event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerStarted","Data":"0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6"}
Oct 07 14:27:46 crc kubenswrapper[4959]: I1007 14:27:46.398299 4959 generic.go:334] "Generic (PLEG): container finished" podID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerID="0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6" exitCode=0
Oct 07 14:27:46 crc kubenswrapper[4959]: I1007 14:27:46.398431 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerDied","Data":"0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6"}
Oct 07 14:27:47 crc kubenswrapper[4959]: I1007 14:27:47.409370 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerStarted","Data":"2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c"}
Oct 07 14:27:47 crc kubenswrapper[4959]: I1007 14:27:47.435071 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bmpmh" podStartSLOduration=2.9674674149999998 podStartE2EDuration="7.435050878s" podCreationTimestamp="2025-10-07 14:27:40 +0000 UTC" firstStartedPulling="2025-10-07 14:27:42.35302535 +0000 UTC m=+5214.513748027" lastFinishedPulling="2025-10-07 14:27:46.820608813 +0000 UTC m=+5218.981331490" observedRunningTime="2025-10-07 14:27:47.431059793 +0000 UTC m=+5219.591782480" watchObservedRunningTime="2025-10-07 14:27:47.435050878 +0000 UTC m=+5219.595773555"
Oct 07 14:27:47 crc kubenswrapper[4959]: I1007 14:27:47.810317 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:27:47 crc kubenswrapper[4959]: E1007 14:27:47.811230 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:27:51 crc kubenswrapper[4959]: I1007 14:27:51.254466 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmpmh"
Oct 07 14:27:51 crc kubenswrapper[4959]: I1007 14:27:51.254933 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmpmh"
Oct 07 14:27:51 crc kubenswrapper[4959]: I1007 14:27:51.304496 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmpmh"
Oct 07 14:28:01 crc kubenswrapper[4959]: I1007 14:28:01.368304 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmpmh"
Oct 07 14:28:01 crc kubenswrapper[4959]: I1007 14:28:01.457325 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmpmh"]
Oct 07 14:28:01 crc kubenswrapper[4959]: I1007 14:28:01.527692 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bmpmh" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="registry-server" containerID="cri-o://2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c" gracePeriod=2
Oct 07 14:28:01 crc kubenswrapper[4959]: I1007 14:28:01.808890 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:28:01 crc kubenswrapper[4959]: E1007 14:28:01.809243 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.103901 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmpmh"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.287982 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-utilities\") pod \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") "
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.288149 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-catalog-content\") pod \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") "
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.288218 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zgfd\" (UniqueName: \"kubernetes.io/projected/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-kube-api-access-5zgfd\") pod \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\" (UID: \"4e3a9b1e-82b9-4c89-be7e-b87629c239e0\") "
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.288838 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-utilities" (OuterVolumeSpecName: "utilities") pod "4e3a9b1e-82b9-4c89-be7e-b87629c239e0" (UID: "4e3a9b1e-82b9-4c89-be7e-b87629c239e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.289041 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.293969 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-kube-api-access-5zgfd" (OuterVolumeSpecName: "kube-api-access-5zgfd") pod "4e3a9b1e-82b9-4c89-be7e-b87629c239e0" (UID: "4e3a9b1e-82b9-4c89-be7e-b87629c239e0"). InnerVolumeSpecName "kube-api-access-5zgfd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.331683 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e3a9b1e-82b9-4c89-be7e-b87629c239e0" (UID: "4e3a9b1e-82b9-4c89-be7e-b87629c239e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.391350 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.391388 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zgfd\" (UniqueName: \"kubernetes.io/projected/4e3a9b1e-82b9-4c89-be7e-b87629c239e0-kube-api-access-5zgfd\") on node \"crc\" DevicePath \"\""
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.537784 4959 generic.go:334] "Generic (PLEG): container finished" podID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerID="2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c" exitCode=0
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.537824 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerDied","Data":"2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c"}
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.537850 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmpmh" event={"ID":"4e3a9b1e-82b9-4c89-be7e-b87629c239e0","Type":"ContainerDied","Data":"e8762633e5b3d17daf906bb7ec0f63cce20433d96fef7acb0af8b8ac2ee7d980"}
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.537870 4959 scope.go:117] "RemoveContainer" containerID="2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.538025 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmpmh"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.577286 4959 scope.go:117] "RemoveContainer" containerID="0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.584617 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmpmh"]
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.594240 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bmpmh"]
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.610906 4959 scope.go:117] "RemoveContainer" containerID="75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.638452 4959 scope.go:117] "RemoveContainer" containerID="2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c"
Oct 07 14:28:02 crc kubenswrapper[4959]: E1007 14:28:02.638964 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c\": container with ID starting with 2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c not found: ID does not exist" containerID="2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.639007 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c"} err="failed to get container status \"2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c\": rpc error: code = NotFound desc = could not find container \"2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c\": container with ID starting with 2b965b4902cf5a7e6cff436b424ee81a8211ec07e8bb8b746b7ac33341d4ed6c not found: ID does not exist"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.639034 4959 scope.go:117] "RemoveContainer" containerID="0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6"
Oct 07 14:28:02 crc kubenswrapper[4959]: E1007 14:28:02.639297 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6\": container with ID starting with 0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6 not found: ID does not exist" containerID="0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.639332 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6"} err="failed to get container status \"0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6\": rpc error: code = NotFound desc = could not find container \"0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6\": container with ID starting with 0713077749e02c23e03f92d2b6fde85dcd39149b7895846f7f19fb3c8bc648c6 not found: ID does not exist"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.639354 4959 scope.go:117] "RemoveContainer" containerID="75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b"
Oct 07 14:28:02 crc kubenswrapper[4959]: E1007 14:28:02.639562 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b\": container with ID starting with 75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b not found: ID does not exist" containerID="75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.639595 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b"} err="failed to get container status \"75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b\": rpc error: code = NotFound desc = could not find container \"75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b\": container with ID starting with 75b38265d3e3e9c090bcf122baf911ef39967fce8dcb30b836bbddc3d60f3b9b not found: ID does not exist"
Oct 07 14:28:02 crc kubenswrapper[4959]: I1007 14:28:02.819374 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" path="/var/lib/kubelet/pods/4e3a9b1e-82b9-4c89-be7e-b87629c239e0/volumes"
Oct 07 14:28:12 crc kubenswrapper[4959]: I1007 14:28:12.816756 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:28:12 crc kubenswrapper[4959]: E1007 14:28:12.820119 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:28:26 crc kubenswrapper[4959]: I1007 14:28:26.808938 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:28:26 crc kubenswrapper[4959]: E1007 14:28:26.809747 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:28:38 crc kubenswrapper[4959]: I1007 14:28:38.817532 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:28:38 crc kubenswrapper[4959]: E1007 14:28:38.818336 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:28:53 crc kubenswrapper[4959]: I1007 14:28:53.808588 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:28:53 crc kubenswrapper[4959]: E1007 14:28:53.809418 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:29:08 crc kubenswrapper[4959]: I1007 14:29:08.823910 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:29:08 crc kubenswrapper[4959]: E1007 14:29:08.825572 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.472270 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2nls2"]
Oct 07 14:29:19 crc kubenswrapper[4959]: E1007 14:29:19.473207 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="registry-server"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.473227 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="registry-server"
Oct 07 14:29:19 crc kubenswrapper[4959]: E1007 14:29:19.473269 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="extract-utilities"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.473280 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="extract-utilities"
Oct 07 14:29:19 crc kubenswrapper[4959]: E1007 14:29:19.473302 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="extract-content"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.473310 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="extract-content"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.473513 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3a9b1e-82b9-4c89-be7e-b87629c239e0" containerName="registry-server"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.475255 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.495149 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nls2"]
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.578846 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-catalog-content\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.579196 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgwml\" (UniqueName: \"kubernetes.io/projected/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-kube-api-access-tgwml\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.579251 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-utilities\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.681198 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-catalog-content\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.681298 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgwml\" (UniqueName: \"kubernetes.io/projected/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-kube-api-access-tgwml\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.681367 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-utilities\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.682204 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-utilities\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.682496 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-catalog-content\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.715981 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgwml\" (UniqueName: \"kubernetes.io/projected/73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b-kube-api-access-tgwml\") pod \"redhat-operators-2nls2\" (UID: \"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b\") " pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:19 crc kubenswrapper[4959]: I1007 14:29:19.799605 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:20 crc kubenswrapper[4959]: I1007 14:29:20.331816 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nls2"]
Oct 07 14:29:20 crc kubenswrapper[4959]: I1007 14:29:20.808916 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:29:20 crc kubenswrapper[4959]: E1007 14:29:20.809481 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:29:21 crc kubenswrapper[4959]: I1007 14:29:21.200446 4959 generic.go:334] "Generic (PLEG): container finished" podID="73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b" containerID="42050d1de4b0bf5d47fc5a1088f238ce0ea2fd7371b2a18eb9b5fa466eafe5e7" exitCode=0
Oct 07 14:29:21 crc kubenswrapper[4959]: I1007 14:29:21.200533 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nls2" event={"ID":"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b","Type":"ContainerDied","Data":"42050d1de4b0bf5d47fc5a1088f238ce0ea2fd7371b2a18eb9b5fa466eafe5e7"}
Oct 07 14:29:21 crc kubenswrapper[4959]: I1007 14:29:21.201308 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nls2" event={"ID":"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b","Type":"ContainerStarted","Data":"8fbedb47b352da5669e90c5a54f90269553802cf9c1cc678d0e5f62d7d938059"}
Oct 07 14:29:21 crc kubenswrapper[4959]: I1007 14:29:21.204012 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 14:29:33 crc kubenswrapper[4959]: I1007 14:29:33.808253 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:29:33 crc kubenswrapper[4959]: E1007 14:29:33.809060 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:29:36 crc kubenswrapper[4959]: I1007 14:29:36.345402 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nls2" event={"ID":"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b","Type":"ContainerStarted","Data":"d901345455c30a256c275a602fe321abcff9239fbd600d3a544e210c17f12403"}
Oct 07 14:29:39 crc kubenswrapper[4959]: I1007 14:29:39.373460 4959 generic.go:334] "Generic (PLEG): container finished" podID="73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b" containerID="d901345455c30a256c275a602fe321abcff9239fbd600d3a544e210c17f12403" exitCode=0
Oct 07 14:29:39 crc kubenswrapper[4959]: I1007 14:29:39.373535 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nls2" event={"ID":"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b","Type":"ContainerDied","Data":"d901345455c30a256c275a602fe321abcff9239fbd600d3a544e210c17f12403"}
Oct 07 14:29:40 crc kubenswrapper[4959]: I1007 14:29:40.387339 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nls2" event={"ID":"73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b","Type":"ContainerStarted","Data":"f2f7601dce1ce8f099edf360412e1386b14e4cf560217ac76c5eb84dd7422a8e"}
Oct 07 14:29:40 crc kubenswrapper[4959]: I1007 14:29:40.411894 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2nls2" podStartSLOduration=2.852771558 podStartE2EDuration="21.411865721s" podCreationTimestamp="2025-10-07 14:29:19 +0000 UTC" firstStartedPulling="2025-10-07 14:29:21.203717861 +0000 UTC m=+5313.364440548" lastFinishedPulling="2025-10-07 14:29:39.762812034 +0000 UTC m=+5331.923534711" observedRunningTime="2025-10-07 14:29:40.410826031 +0000 UTC m=+5332.571548728" watchObservedRunningTime="2025-10-07 14:29:40.411865721 +0000 UTC m=+5332.572588398"
Oct 07 14:29:45 crc kubenswrapper[4959]: I1007 14:29:45.809111 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e"
Oct 07 14:29:45 crc kubenswrapper[4959]: E1007 14:29:45.810174 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:29:49 crc kubenswrapper[4959]: I1007 14:29:49.800262 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:49 crc kubenswrapper[4959]: I1007 14:29:49.801159 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:49 crc kubenswrapper[4959]: I1007 14:29:49.850828 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:50 crc kubenswrapper[4959]: I1007 14:29:50.545997 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2nls2"
Oct 07 14:29:50 crc kubenswrapper[4959]: I1007 14:29:50.613036 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nls2"]
Oct 07 14:29:50 crc kubenswrapper[4959]: I1007 14:29:50.679491 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pf2tx"]
Oct 07 14:29:50 crc kubenswrapper[4959]: I1007 14:29:50.680068 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pf2tx" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="registry-server" containerID="cri-o://8602481403ace8c53d45347365090f3ca2ef82c8dea1d645828175c5eb3eebb7" gracePeriod=2
Oct 07 14:29:51 crc kubenswrapper[4959]: I1007 14:29:51.527303 4959 generic.go:334] "Generic (PLEG): container finished" podID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerID="8602481403ace8c53d45347365090f3ca2ef82c8dea1d645828175c5eb3eebb7" exitCode=0
Oct 07 14:29:51 crc kubenswrapper[4959]: I1007 14:29:51.527549 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf2tx" event={"ID":"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf","Type":"ContainerDied","Data":"8602481403ace8c53d45347365090f3ca2ef82c8dea1d645828175c5eb3eebb7"}
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.403742 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pf2tx"
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.479814 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wwv5\" (UniqueName: \"kubernetes.io/projected/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-kube-api-access-5wwv5\") pod \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") "
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.479892 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-catalog-content\") pod \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") "
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.480226 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-utilities\") pod \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\" (UID: \"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf\") "
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.498782 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-utilities" (OuterVolumeSpecName: "utilities") pod "2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" (UID: "2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.532796 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-kube-api-access-5wwv5" (OuterVolumeSpecName: "kube-api-access-5wwv5") pod "2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" (UID: "2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf"). InnerVolumeSpecName "kube-api-access-5wwv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.541317 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pf2tx"
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.541762 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf2tx" event={"ID":"2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf","Type":"ContainerDied","Data":"f9aec09891a36d308dc0775f04bc1dfa0e0afdf06998df3cd5b91c33115660c5"}
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.542473 4959 scope.go:117] "RemoveContainer" containerID="8602481403ace8c53d45347365090f3ca2ef82c8dea1d645828175c5eb3eebb7"
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.586225 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wwv5\" (UniqueName: \"kubernetes.io/projected/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-kube-api-access-5wwv5\") on node \"crc\" DevicePath \"\""
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.586328 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.613427 4959 scope.go:117] "RemoveContainer" containerID="97d5d5aa99b2f363082c35d1c7fe7ee06d8eccd3497622f5bf838ea6f0e508f7"
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.681646 4959 scope.go:117] "RemoveContainer" containerID="a55b485333f098d979a21beaecc0a14acfcce37c94af187bdceae66d2335982d"
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.796643 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" (UID: "2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.869789 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pf2tx"]
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.890647 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pf2tx"]
Oct 07 14:29:52 crc kubenswrapper[4959]: I1007 14:29:52.891853 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:29:54 crc kubenswrapper[4959]: I1007 14:29:54.819577 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" path="/var/lib/kubelet/pods/2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf/volumes"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.140610 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"]
Oct 07 14:30:00 crc kubenswrapper[4959]: E1007 14:30:00.141744 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="registry-server"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.141787 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="registry-server"
Oct 07 14:30:00 crc kubenswrapper[4959]: E1007 14:30:00.141816 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="extract-content"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.141824 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="extract-content"
Oct 07 14:30:00 crc kubenswrapper[4959]: E1007 14:30:00.141845 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="extract-utilities"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.141856 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="extract-utilities"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.142169 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dccb71a-80aa-49d3-bcc3-b1713a7bfbdf" containerName="registry-server"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.143008 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.146491 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.149857 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.152074 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"]
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.224313 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dv2k\" (UniqueName: \"kubernetes.io/projected/59e8a0c0-5963-4abf-bc69-a015bf7e0064-kube-api-access-7dv2k\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.224725 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59e8a0c0-5963-4abf-bc69-a015bf7e0064-secret-volume\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.224772 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e8a0c0-5963-4abf-bc69-a015bf7e0064-config-volume\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.327084 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59e8a0c0-5963-4abf-bc69-a015bf7e0064-secret-volume\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.327159 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e8a0c0-5963-4abf-bc69-a015bf7e0064-config-volume\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.327231 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dv2k\" (UniqueName: \"kubernetes.io/projected/59e8a0c0-5963-4abf-bc69-a015bf7e0064-kube-api-access-7dv2k\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.328331 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e8a0c0-5963-4abf-bc69-a015bf7e0064-config-volume\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.381135 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59e8a0c0-5963-4abf-bc69-a015bf7e0064-secret-volume\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.381188 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dv2k\" (UniqueName: \"kubernetes.io/projected/59e8a0c0-5963-4abf-bc69-a015bf7e0064-kube-api-access-7dv2k\") pod \"collect-profiles-29330790-kb2lk\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"
Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.463411 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.825592 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:30:00 crc kubenswrapper[4959]: E1007 14:30:00.827211 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:30:00 crc kubenswrapper[4959]: I1007 14:30:00.956491 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"] Oct 07 14:30:01 crc kubenswrapper[4959]: I1007 14:30:01.638223 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" event={"ID":"59e8a0c0-5963-4abf-bc69-a015bf7e0064","Type":"ContainerStarted","Data":"f42a54ed3c6e9321bb4da3c3d693b4b8a546006bca1900b6ee72fc1e0482cc40"} Oct 07 14:30:01 crc kubenswrapper[4959]: I1007 14:30:01.638475 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" event={"ID":"59e8a0c0-5963-4abf-bc69-a015bf7e0064","Type":"ContainerStarted","Data":"39f9eb023ce64499443086eb90aaf4512193d26e0e95fac9d7ae08f54976a3e8"} Oct 07 14:30:02 crc kubenswrapper[4959]: I1007 14:30:02.649025 4959 generic.go:334] "Generic (PLEG): container finished" podID="59e8a0c0-5963-4abf-bc69-a015bf7e0064" containerID="f42a54ed3c6e9321bb4da3c3d693b4b8a546006bca1900b6ee72fc1e0482cc40" exitCode=0 Oct 07 14:30:02 crc kubenswrapper[4959]: I1007 14:30:02.649112 4959 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" event={"ID":"59e8a0c0-5963-4abf-bc69-a015bf7e0064","Type":"ContainerDied","Data":"f42a54ed3c6e9321bb4da3c3d693b4b8a546006bca1900b6ee72fc1e0482cc40"} Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.070571 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.081849 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59e8a0c0-5963-4abf-bc69-a015bf7e0064-secret-volume\") pod \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.082022 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dv2k\" (UniqueName: \"kubernetes.io/projected/59e8a0c0-5963-4abf-bc69-a015bf7e0064-kube-api-access-7dv2k\") pod \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.082214 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e8a0c0-5963-4abf-bc69-a015bf7e0064-config-volume\") pod \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\" (UID: \"59e8a0c0-5963-4abf-bc69-a015bf7e0064\") " Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.083137 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e8a0c0-5963-4abf-bc69-a015bf7e0064-config-volume" (OuterVolumeSpecName: "config-volume") pod "59e8a0c0-5963-4abf-bc69-a015bf7e0064" (UID: "59e8a0c0-5963-4abf-bc69-a015bf7e0064"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.150036 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e8a0c0-5963-4abf-bc69-a015bf7e0064-kube-api-access-7dv2k" (OuterVolumeSpecName: "kube-api-access-7dv2k") pod "59e8a0c0-5963-4abf-bc69-a015bf7e0064" (UID: "59e8a0c0-5963-4abf-bc69-a015bf7e0064"). InnerVolumeSpecName "kube-api-access-7dv2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.150433 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e8a0c0-5963-4abf-bc69-a015bf7e0064-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "59e8a0c0-5963-4abf-bc69-a015bf7e0064" (UID: "59e8a0c0-5963-4abf-bc69-a015bf7e0064"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.184674 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59e8a0c0-5963-4abf-bc69-a015bf7e0064-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.184720 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dv2k\" (UniqueName: \"kubernetes.io/projected/59e8a0c0-5963-4abf-bc69-a015bf7e0064-kube-api-access-7dv2k\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.184733 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e8a0c0-5963-4abf-bc69-a015bf7e0064-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.659699 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" 
event={"ID":"59e8a0c0-5963-4abf-bc69-a015bf7e0064","Type":"ContainerDied","Data":"39f9eb023ce64499443086eb90aaf4512193d26e0e95fac9d7ae08f54976a3e8"} Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.659737 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f9eb023ce64499443086eb90aaf4512193d26e0e95fac9d7ae08f54976a3e8" Oct 07 14:30:03 crc kubenswrapper[4959]: I1007 14:30:03.659817 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk" Oct 07 14:30:04 crc kubenswrapper[4959]: I1007 14:30:04.165914 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"] Oct 07 14:30:04 crc kubenswrapper[4959]: I1007 14:30:04.176567 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-pdqc9"] Oct 07 14:30:04 crc kubenswrapper[4959]: I1007 14:30:04.823965 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6a560c-ff3d-432a-8db0-c51d43ce4082" path="/var/lib/kubelet/pods/ba6a560c-ff3d-432a-8db0-c51d43ce4082/volumes" Oct 07 14:30:15 crc kubenswrapper[4959]: I1007 14:30:15.810313 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:30:15 crc kubenswrapper[4959]: E1007 14:30:15.811687 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:30:27 crc kubenswrapper[4959]: I1007 14:30:27.809701 4959 scope.go:117] "RemoveContainer" 
containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:30:27 crc kubenswrapper[4959]: E1007 14:30:27.810526 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:30:40 crc kubenswrapper[4959]: I1007 14:30:40.809716 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:30:40 crc kubenswrapper[4959]: E1007 14:30:40.810611 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:30:51 crc kubenswrapper[4959]: I1007 14:30:51.809516 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:30:51 crc kubenswrapper[4959]: E1007 14:30:51.810197 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:30:55 crc kubenswrapper[4959]: I1007 14:30:55.803524 4959 scope.go:117] 
"RemoveContainer" containerID="2b1dac7f04b7d257a085580a3e39894532d2f35eaa2de437d9a8685fb1a766ab" Oct 07 14:31:04 crc kubenswrapper[4959]: I1007 14:31:04.811149 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:31:04 crc kubenswrapper[4959]: E1007 14:31:04.812036 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:31:18 crc kubenswrapper[4959]: I1007 14:31:18.814583 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:31:18 crc kubenswrapper[4959]: E1007 14:31:18.815492 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:31:30 crc kubenswrapper[4959]: I1007 14:31:30.809216 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:31:30 crc kubenswrapper[4959]: E1007 14:31:30.809894 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:31:43 crc kubenswrapper[4959]: I1007 14:31:43.809363 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:31:43 crc kubenswrapper[4959]: E1007 14:31:43.810020 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:31:55 crc kubenswrapper[4959]: I1007 14:31:55.809283 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:31:55 crc kubenswrapper[4959]: E1007 14:31:55.810149 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:32:10 crc kubenswrapper[4959]: I1007 14:32:10.819082 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:32:11 crc kubenswrapper[4959]: I1007 14:32:11.849418 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"d9649276f7ea05112e664c1277f499770915cec94da54645e2bbd70215ba3b52"} Oct 07 14:34:06 crc 
kubenswrapper[4959]: I1007 14:34:06.188945 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-522cm"] Oct 07 14:34:06 crc kubenswrapper[4959]: E1007 14:34:06.190601 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e8a0c0-5963-4abf-bc69-a015bf7e0064" containerName="collect-profiles" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.190640 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e8a0c0-5963-4abf-bc69-a015bf7e0064" containerName="collect-profiles" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.190907 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e8a0c0-5963-4abf-bc69-a015bf7e0064" containerName="collect-profiles" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.193272 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.204157 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-522cm"] Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.322967 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-catalog-content\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.323033 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-utilities\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 
14:34:06.323099 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plwsk\" (UniqueName: \"kubernetes.io/projected/24498462-5866-48f3-be02-954383a58839-kube-api-access-plwsk\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.425218 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plwsk\" (UniqueName: \"kubernetes.io/projected/24498462-5866-48f3-be02-954383a58839-kube-api-access-plwsk\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.425434 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-catalog-content\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.425460 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-utilities\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.426040 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-utilities\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 
14:34:06.426093 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-catalog-content\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.456005 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plwsk\" (UniqueName: \"kubernetes.io/projected/24498462-5866-48f3-be02-954383a58839-kube-api-access-plwsk\") pod \"community-operators-522cm\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:06 crc kubenswrapper[4959]: I1007 14:34:06.532600 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:07 crc kubenswrapper[4959]: I1007 14:34:07.213383 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-522cm"] Oct 07 14:34:07 crc kubenswrapper[4959]: I1007 14:34:07.905759 4959 generic.go:334] "Generic (PLEG): container finished" podID="24498462-5866-48f3-be02-954383a58839" containerID="15b83f595fdfc1080613f852db3d20b5a666f35e67d547283d8fe341df3ce079" exitCode=0 Oct 07 14:34:07 crc kubenswrapper[4959]: I1007 14:34:07.906135 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerDied","Data":"15b83f595fdfc1080613f852db3d20b5a666f35e67d547283d8fe341df3ce079"} Oct 07 14:34:07 crc kubenswrapper[4959]: I1007 14:34:07.906197 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" 
event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerStarted","Data":"5121679c58555a6c05e32dfb64c55997caf06b87286adb1a53efd6607950299c"} Oct 07 14:34:09 crc kubenswrapper[4959]: I1007 14:34:09.926502 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerStarted","Data":"4aae2fcf138ea49d16e293b9a766530636e9270e2aca759f85db70b074ea9c81"} Oct 07 14:34:12 crc kubenswrapper[4959]: I1007 14:34:12.968557 4959 generic.go:334] "Generic (PLEG): container finished" podID="24498462-5866-48f3-be02-954383a58839" containerID="4aae2fcf138ea49d16e293b9a766530636e9270e2aca759f85db70b074ea9c81" exitCode=0 Oct 07 14:34:12 crc kubenswrapper[4959]: I1007 14:34:12.968648 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerDied","Data":"4aae2fcf138ea49d16e293b9a766530636e9270e2aca759f85db70b074ea9c81"} Oct 07 14:34:14 crc kubenswrapper[4959]: I1007 14:34:14.990142 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerStarted","Data":"2acbb53b4d7f949202dbc89117b0955d37a4c39c18e9aff5cf30c9898841b948"} Oct 07 14:34:15 crc kubenswrapper[4959]: I1007 14:34:15.023327 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-522cm" podStartSLOduration=3.040077002 podStartE2EDuration="9.023304274s" podCreationTimestamp="2025-10-07 14:34:06 +0000 UTC" firstStartedPulling="2025-10-07 14:34:07.911783625 +0000 UTC m=+5600.072506312" lastFinishedPulling="2025-10-07 14:34:13.895010907 +0000 UTC m=+5606.055733584" observedRunningTime="2025-10-07 14:34:15.014453791 +0000 UTC m=+5607.175176478" watchObservedRunningTime="2025-10-07 14:34:15.023304274 +0000 UTC 
m=+5607.184026951" Oct 07 14:34:16 crc kubenswrapper[4959]: I1007 14:34:16.533738 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:16 crc kubenswrapper[4959]: I1007 14:34:16.534161 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:16 crc kubenswrapper[4959]: I1007 14:34:16.593507 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:26 crc kubenswrapper[4959]: I1007 14:34:26.583734 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:26 crc kubenswrapper[4959]: I1007 14:34:26.634005 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-522cm"] Oct 07 14:34:27 crc kubenswrapper[4959]: I1007 14:34:27.094548 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-522cm" podUID="24498462-5866-48f3-be02-954383a58839" containerName="registry-server" containerID="cri-o://2acbb53b4d7f949202dbc89117b0955d37a4c39c18e9aff5cf30c9898841b948" gracePeriod=2 Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.115432 4959 generic.go:334] "Generic (PLEG): container finished" podID="24498462-5866-48f3-be02-954383a58839" containerID="2acbb53b4d7f949202dbc89117b0955d37a4c39c18e9aff5cf30c9898841b948" exitCode=0 Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.115497 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerDied","Data":"2acbb53b4d7f949202dbc89117b0955d37a4c39c18e9aff5cf30c9898841b948"} Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.359285 4959 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.419428 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plwsk\" (UniqueName: \"kubernetes.io/projected/24498462-5866-48f3-be02-954383a58839-kube-api-access-plwsk\") pod \"24498462-5866-48f3-be02-954383a58839\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.420011 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-utilities\") pod \"24498462-5866-48f3-be02-954383a58839\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.420120 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-catalog-content\") pod \"24498462-5866-48f3-be02-954383a58839\" (UID: \"24498462-5866-48f3-be02-954383a58839\") " Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.420849 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-utilities" (OuterVolumeSpecName: "utilities") pod "24498462-5866-48f3-be02-954383a58839" (UID: "24498462-5866-48f3-be02-954383a58839"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.442233 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24498462-5866-48f3-be02-954383a58839-kube-api-access-plwsk" (OuterVolumeSpecName: "kube-api-access-plwsk") pod "24498462-5866-48f3-be02-954383a58839" (UID: "24498462-5866-48f3-be02-954383a58839"). 
InnerVolumeSpecName "kube-api-access-plwsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.473275 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24498462-5866-48f3-be02-954383a58839" (UID: "24498462-5866-48f3-be02-954383a58839"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.523223 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.523273 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plwsk\" (UniqueName: \"kubernetes.io/projected/24498462-5866-48f3-be02-954383a58839-kube-api-access-plwsk\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:28 crc kubenswrapper[4959]: I1007 14:34:28.523291 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24498462-5866-48f3-be02-954383a58839-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.126211 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-522cm" event={"ID":"24498462-5866-48f3-be02-954383a58839","Type":"ContainerDied","Data":"5121679c58555a6c05e32dfb64c55997caf06b87286adb1a53efd6607950299c"} Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.126277 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-522cm" Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.126285 4959 scope.go:117] "RemoveContainer" containerID="2acbb53b4d7f949202dbc89117b0955d37a4c39c18e9aff5cf30c9898841b948" Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.156319 4959 scope.go:117] "RemoveContainer" containerID="4aae2fcf138ea49d16e293b9a766530636e9270e2aca759f85db70b074ea9c81" Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.191922 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-522cm"] Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.194618 4959 scope.go:117] "RemoveContainer" containerID="15b83f595fdfc1080613f852db3d20b5a666f35e67d547283d8fe341df3ce079" Oct 07 14:34:29 crc kubenswrapper[4959]: I1007 14:34:29.199892 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-522cm"] Oct 07 14:34:30 crc kubenswrapper[4959]: I1007 14:34:30.820127 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24498462-5866-48f3-be02-954383a58839" path="/var/lib/kubelet/pods/24498462-5866-48f3-be02-954383a58839/volumes" Oct 07 14:34:37 crc kubenswrapper[4959]: I1007 14:34:37.696179 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:34:37 crc kubenswrapper[4959]: I1007 14:34:37.697619 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:35:07 crc kubenswrapper[4959]: 
I1007 14:35:07.695768 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:35:07 crc kubenswrapper[4959]: I1007 14:35:07.696394 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:35:37 crc kubenswrapper[4959]: I1007 14:35:37.695950 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:35:37 crc kubenswrapper[4959]: I1007 14:35:37.696424 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:35:37 crc kubenswrapper[4959]: I1007 14:35:37.696470 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 14:35:37 crc kubenswrapper[4959]: I1007 14:35:37.697282 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9649276f7ea05112e664c1277f499770915cec94da54645e2bbd70215ba3b52"} 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:35:37 crc kubenswrapper[4959]: I1007 14:35:37.697340 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://d9649276f7ea05112e664c1277f499770915cec94da54645e2bbd70215ba3b52" gracePeriod=600 Oct 07 14:35:38 crc kubenswrapper[4959]: I1007 14:35:38.770169 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="d9649276f7ea05112e664c1277f499770915cec94da54645e2bbd70215ba3b52" exitCode=0 Oct 07 14:35:38 crc kubenswrapper[4959]: I1007 14:35:38.770212 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"d9649276f7ea05112e664c1277f499770915cec94da54645e2bbd70215ba3b52"} Oct 07 14:35:38 crc kubenswrapper[4959]: I1007 14:35:38.771374 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"} Oct 07 14:35:38 crc kubenswrapper[4959]: I1007 14:35:38.771414 4959 scope.go:117] "RemoveContainer" containerID="ca8d460d6728c8257d38f6c7a448d09806bf5280a3bc65dc046d2704e5efd69e" Oct 07 14:38:07 crc kubenswrapper[4959]: I1007 14:38:07.695422 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 07 14:38:07 crc kubenswrapper[4959]: I1007 14:38:07.695970 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.051522 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2p42r"] Oct 07 14:38:34 crc kubenswrapper[4959]: E1007 14:38:34.053082 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24498462-5866-48f3-be02-954383a58839" containerName="extract-utilities" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.053103 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="24498462-5866-48f3-be02-954383a58839" containerName="extract-utilities" Oct 07 14:38:34 crc kubenswrapper[4959]: E1007 14:38:34.053123 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24498462-5866-48f3-be02-954383a58839" containerName="registry-server" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.053129 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="24498462-5866-48f3-be02-954383a58839" containerName="registry-server" Oct 07 14:38:34 crc kubenswrapper[4959]: E1007 14:38:34.053144 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24498462-5866-48f3-be02-954383a58839" containerName="extract-content" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.053151 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="24498462-5866-48f3-be02-954383a58839" containerName="extract-content" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.053391 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="24498462-5866-48f3-be02-954383a58839" containerName="registry-server" Oct 07 14:38:34 crc 
kubenswrapper[4959]: I1007 14:38:34.055345 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.063771 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p42r"] Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.147183 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-catalog-content\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.147452 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vss6x\" (UniqueName: \"kubernetes.io/projected/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-kube-api-access-vss6x\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.147496 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-utilities\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.249934 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vss6x\" (UniqueName: \"kubernetes.io/projected/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-kube-api-access-vss6x\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 
14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.249979 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-utilities\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.250051 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-catalog-content\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.250641 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-catalog-content\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.250817 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-utilities\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.281478 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vss6x\" (UniqueName: \"kubernetes.io/projected/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-kube-api-access-vss6x\") pod \"redhat-marketplace-2p42r\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:34 crc kubenswrapper[4959]: I1007 14:38:34.379168 4959 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:35 crc kubenswrapper[4959]: I1007 14:38:35.018867 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p42r"] Oct 07 14:38:35 crc kubenswrapper[4959]: I1007 14:38:35.263398 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p42r" event={"ID":"32ac78d2-59c6-4c7e-8dea-b235973d5fbc","Type":"ContainerStarted","Data":"1261734bbd7ee0fa4e3f43ca20f435c7d804268e8e936a8b0344fa8996ccaa0b"} Oct 07 14:38:36 crc kubenswrapper[4959]: I1007 14:38:36.287363 4959 generic.go:334] "Generic (PLEG): container finished" podID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerID="4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881" exitCode=0 Oct 07 14:38:36 crc kubenswrapper[4959]: I1007 14:38:36.287709 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p42r" event={"ID":"32ac78d2-59c6-4c7e-8dea-b235973d5fbc","Type":"ContainerDied","Data":"4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881"} Oct 07 14:38:36 crc kubenswrapper[4959]: I1007 14:38:36.291084 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:38:37 crc kubenswrapper[4959]: I1007 14:38:37.695391 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:38:37 crc kubenswrapper[4959]: I1007 14:38:37.695810 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:38:38 crc kubenswrapper[4959]: I1007 14:38:38.310692 4959 generic.go:334] "Generic (PLEG): container finished" podID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerID="0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007" exitCode=0 Oct 07 14:38:38 crc kubenswrapper[4959]: I1007 14:38:38.311144 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p42r" event={"ID":"32ac78d2-59c6-4c7e-8dea-b235973d5fbc","Type":"ContainerDied","Data":"0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007"} Oct 07 14:38:43 crc kubenswrapper[4959]: I1007 14:38:43.350108 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p42r" event={"ID":"32ac78d2-59c6-4c7e-8dea-b235973d5fbc","Type":"ContainerStarted","Data":"fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e"} Oct 07 14:38:43 crc kubenswrapper[4959]: I1007 14:38:43.374741 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2p42r" podStartSLOduration=2.807511838 podStartE2EDuration="9.374722446s" podCreationTimestamp="2025-10-07 14:38:34 +0000 UTC" firstStartedPulling="2025-10-07 14:38:36.29079287 +0000 UTC m=+5868.451515547" lastFinishedPulling="2025-10-07 14:38:42.858003478 +0000 UTC m=+5875.018726155" observedRunningTime="2025-10-07 14:38:43.370818624 +0000 UTC m=+5875.531541301" watchObservedRunningTime="2025-10-07 14:38:43.374722446 +0000 UTC m=+5875.535445123" Oct 07 14:38:44 crc kubenswrapper[4959]: I1007 14:38:44.380197 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:44 crc kubenswrapper[4959]: I1007 14:38:44.380239 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:44 crc kubenswrapper[4959]: I1007 14:38:44.438691 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:54 crc kubenswrapper[4959]: I1007 14:38:54.422575 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:54 crc kubenswrapper[4959]: I1007 14:38:54.472286 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p42r"] Oct 07 14:38:54 crc kubenswrapper[4959]: I1007 14:38:54.472512 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2p42r" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="registry-server" containerID="cri-o://fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e" gracePeriod=2 Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.017821 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.138505 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-utilities\") pod \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.138743 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vss6x\" (UniqueName: \"kubernetes.io/projected/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-kube-api-access-vss6x\") pod \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.138799 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-catalog-content\") pod \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\" (UID: \"32ac78d2-59c6-4c7e-8dea-b235973d5fbc\") " Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.139414 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-utilities" (OuterVolumeSpecName: "utilities") pod "32ac78d2-59c6-4c7e-8dea-b235973d5fbc" (UID: "32ac78d2-59c6-4c7e-8dea-b235973d5fbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.151673 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-kube-api-access-vss6x" (OuterVolumeSpecName: "kube-api-access-vss6x") pod "32ac78d2-59c6-4c7e-8dea-b235973d5fbc" (UID: "32ac78d2-59c6-4c7e-8dea-b235973d5fbc"). InnerVolumeSpecName "kube-api-access-vss6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.154309 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32ac78d2-59c6-4c7e-8dea-b235973d5fbc" (UID: "32ac78d2-59c6-4c7e-8dea-b235973d5fbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.240802 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.240836 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vss6x\" (UniqueName: \"kubernetes.io/projected/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-kube-api-access-vss6x\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.240846 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32ac78d2-59c6-4c7e-8dea-b235973d5fbc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.456241 4959 generic.go:334] "Generic (PLEG): container finished" podID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerID="fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e" exitCode=0 Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.456343 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p42r" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.456369 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p42r" event={"ID":"32ac78d2-59c6-4c7e-8dea-b235973d5fbc","Type":"ContainerDied","Data":"fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e"} Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.456685 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p42r" event={"ID":"32ac78d2-59c6-4c7e-8dea-b235973d5fbc","Type":"ContainerDied","Data":"1261734bbd7ee0fa4e3f43ca20f435c7d804268e8e936a8b0344fa8996ccaa0b"} Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.456708 4959 scope.go:117] "RemoveContainer" containerID="fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.486596 4959 scope.go:117] "RemoveContainer" containerID="0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.499868 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p42r"] Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.507545 4959 scope.go:117] "RemoveContainer" containerID="4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.508129 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p42r"] Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.556429 4959 scope.go:117] "RemoveContainer" containerID="fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e" Oct 07 14:38:55 crc kubenswrapper[4959]: E1007 14:38:55.556916 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e\": container with ID starting with fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e not found: ID does not exist" containerID="fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.556956 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e"} err="failed to get container status \"fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e\": rpc error: code = NotFound desc = could not find container \"fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e\": container with ID starting with fe438831ef4900c27234f486dfab7fa9ddc83810fa8e373f13383b0b0fb89d4e not found: ID does not exist" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.556986 4959 scope.go:117] "RemoveContainer" containerID="0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007" Oct 07 14:38:55 crc kubenswrapper[4959]: E1007 14:38:55.557372 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007\": container with ID starting with 0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007 not found: ID does not exist" containerID="0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.557415 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007"} err="failed to get container status \"0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007\": rpc error: code = NotFound desc = could not find container \"0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007\": container with ID 
starting with 0fc6febcbb9f1eb6be1029e9abffa4835cad7f98865ddbe6ab4be1f3ba986007 not found: ID does not exist" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.557444 4959 scope.go:117] "RemoveContainer" containerID="4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881" Oct 07 14:38:55 crc kubenswrapper[4959]: E1007 14:38:55.557854 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881\": container with ID starting with 4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881 not found: ID does not exist" containerID="4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881" Oct 07 14:38:55 crc kubenswrapper[4959]: I1007 14:38:55.557916 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881"} err="failed to get container status \"4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881\": rpc error: code = NotFound desc = could not find container \"4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881\": container with ID starting with 4d09c3b8189c548374b30f418aae756fc2dd3b06dc837ea7c62b48e53a70d881 not found: ID does not exist" Oct 07 14:38:56 crc kubenswrapper[4959]: I1007 14:38:56.821140 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" path="/var/lib/kubelet/pods/32ac78d2-59c6-4c7e-8dea-b235973d5fbc/volumes" Oct 07 14:39:07 crc kubenswrapper[4959]: I1007 14:39:07.698145 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:39:07 crc kubenswrapper[4959]: I1007 
14:39:07.700471 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:39:07 crc kubenswrapper[4959]: I1007 14:39:07.700607 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 14:39:07 crc kubenswrapper[4959]: I1007 14:39:07.701416 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:39:07 crc kubenswrapper[4959]: I1007 14:39:07.701593 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" gracePeriod=600 Oct 07 14:39:07 crc kubenswrapper[4959]: E1007 14:39:07.835002 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:39:08 crc kubenswrapper[4959]: I1007 14:39:08.573591 4959 generic.go:334] "Generic (PLEG): container finished" 
podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" exitCode=0 Oct 07 14:39:08 crc kubenswrapper[4959]: I1007 14:39:08.573657 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"} Oct 07 14:39:08 crc kubenswrapper[4959]: I1007 14:39:08.573975 4959 scope.go:117] "RemoveContainer" containerID="d9649276f7ea05112e664c1277f499770915cec94da54645e2bbd70215ba3b52" Oct 07 14:39:08 crc kubenswrapper[4959]: I1007 14:39:08.574785 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:39:08 crc kubenswrapper[4959]: E1007 14:39:08.575219 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:39:21 crc kubenswrapper[4959]: I1007 14:39:21.808536 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:39:21 crc kubenswrapper[4959]: E1007 14:39:21.809331 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 
14:39:33 crc kubenswrapper[4959]: I1007 14:39:33.809533 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:39:33 crc kubenswrapper[4959]: E1007 14:39:33.810494 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:39:48 crc kubenswrapper[4959]: I1007 14:39:48.818664 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:39:48 crc kubenswrapper[4959]: E1007 14:39:48.819785 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:39:59 crc kubenswrapper[4959]: I1007 14:39:59.810369 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:39:59 crc kubenswrapper[4959]: E1007 14:39:59.811314 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.706357 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j42sd"]
Oct 07 14:40:05 crc kubenswrapper[4959]: E1007 14:40:05.707410 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="extract-content"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.707426 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="extract-content"
Oct 07 14:40:05 crc kubenswrapper[4959]: E1007 14:40:05.707448 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="extract-utilities"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.707456 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="extract-utilities"
Oct 07 14:40:05 crc kubenswrapper[4959]: E1007 14:40:05.707487 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="registry-server"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.707494 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="registry-server"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.707778 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ac78d2-59c6-4c7e-8dea-b235973d5fbc" containerName="registry-server"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.709426 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.718765 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j42sd"]
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.832885 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-utilities\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.832998 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-catalog-content\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.833050 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjhx\" (UniqueName: \"kubernetes.io/projected/48087522-736e-4c7f-9619-705b1beb5c7d-kube-api-access-kzjhx\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.935527 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-utilities\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.935738 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-catalog-content\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.935786 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjhx\" (UniqueName: \"kubernetes.io/projected/48087522-736e-4c7f-9619-705b1beb5c7d-kube-api-access-kzjhx\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.936710 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-utilities\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.936823 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-catalog-content\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:05 crc kubenswrapper[4959]: I1007 14:40:05.966651 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjhx\" (UniqueName: \"kubernetes.io/projected/48087522-736e-4c7f-9619-705b1beb5c7d-kube-api-access-kzjhx\") pod \"redhat-operators-j42sd\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") " pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:06 crc kubenswrapper[4959]: I1007 14:40:06.028001 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:06 crc kubenswrapper[4959]: W1007 14:40:06.574450 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48087522_736e_4c7f_9619_705b1beb5c7d.slice/crio-9ae97d31722201fac542486ab37e61b20817458237171ebb6bd207dbfc6c7e49 WatchSource:0}: Error finding container 9ae97d31722201fac542486ab37e61b20817458237171ebb6bd207dbfc6c7e49: Status 404 returned error can't find the container with id 9ae97d31722201fac542486ab37e61b20817458237171ebb6bd207dbfc6c7e49
Oct 07 14:40:06 crc kubenswrapper[4959]: I1007 14:40:06.580717 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j42sd"]
Oct 07 14:40:07 crc kubenswrapper[4959]: I1007 14:40:07.107928 4959 generic.go:334] "Generic (PLEG): container finished" podID="48087522-736e-4c7f-9619-705b1beb5c7d" containerID="d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212" exitCode=0
Oct 07 14:40:07 crc kubenswrapper[4959]: I1007 14:40:07.107975 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerDied","Data":"d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212"}
Oct 07 14:40:07 crc kubenswrapper[4959]: I1007 14:40:07.108001 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerStarted","Data":"9ae97d31722201fac542486ab37e61b20817458237171ebb6bd207dbfc6c7e49"}
Oct 07 14:40:09 crc kubenswrapper[4959]: I1007 14:40:09.135751 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerStarted","Data":"0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395"}
Oct 07 14:40:12 crc kubenswrapper[4959]: I1007 14:40:12.167749 4959 generic.go:334] "Generic (PLEG): container finished" podID="48087522-736e-4c7f-9619-705b1beb5c7d" containerID="0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395" exitCode=0
Oct 07 14:40:12 crc kubenswrapper[4959]: I1007 14:40:12.167884 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerDied","Data":"0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395"}
Oct 07 14:40:13 crc kubenswrapper[4959]: I1007 14:40:13.181812 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerStarted","Data":"e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403"}
Oct 07 14:40:13 crc kubenswrapper[4959]: I1007 14:40:13.209068 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j42sd" podStartSLOduration=2.644704386 podStartE2EDuration="8.209045967s" podCreationTimestamp="2025-10-07 14:40:05 +0000 UTC" firstStartedPulling="2025-10-07 14:40:07.11341291 +0000 UTC m=+5959.274135587" lastFinishedPulling="2025-10-07 14:40:12.677754491 +0000 UTC m=+5964.838477168" observedRunningTime="2025-10-07 14:40:13.203120827 +0000 UTC m=+5965.363843524" watchObservedRunningTime="2025-10-07 14:40:13.209045967 +0000 UTC m=+5965.369768644"
Oct 07 14:40:13 crc kubenswrapper[4959]: I1007 14:40:13.809498 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:40:13 crc kubenswrapper[4959]: E1007 14:40:13.809911 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:40:16 crc kubenswrapper[4959]: I1007 14:40:16.028381 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:16 crc kubenswrapper[4959]: I1007 14:40:16.028737 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:17 crc kubenswrapper[4959]: I1007 14:40:17.078290 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j42sd" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="registry-server" probeResult="failure" output=<
Oct 07 14:40:17 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s
Oct 07 14:40:17 crc kubenswrapper[4959]: >
Oct 07 14:40:25 crc kubenswrapper[4959]: I1007 14:40:25.809144 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:40:25 crc kubenswrapper[4959]: E1007 14:40:25.810003 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:40:26 crc kubenswrapper[4959]: I1007 14:40:26.080230 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:26 crc kubenswrapper[4959]: I1007 14:40:26.131814 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:26 crc kubenswrapper[4959]: I1007 14:40:26.317124 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j42sd"]
Oct 07 14:40:27 crc kubenswrapper[4959]: I1007 14:40:27.294864 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j42sd" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="registry-server" containerID="cri-o://e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403" gracePeriod=2
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.069400 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.218264 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjhx\" (UniqueName: \"kubernetes.io/projected/48087522-736e-4c7f-9619-705b1beb5c7d-kube-api-access-kzjhx\") pod \"48087522-736e-4c7f-9619-705b1beb5c7d\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") "
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.218867 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-utilities\") pod \"48087522-736e-4c7f-9619-705b1beb5c7d\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") "
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.218969 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-catalog-content\") pod \"48087522-736e-4c7f-9619-705b1beb5c7d\" (UID: \"48087522-736e-4c7f-9619-705b1beb5c7d\") "
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.219714 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-utilities" (OuterVolumeSpecName: "utilities") pod "48087522-736e-4c7f-9619-705b1beb5c7d" (UID: "48087522-736e-4c7f-9619-705b1beb5c7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.230001 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48087522-736e-4c7f-9619-705b1beb5c7d-kube-api-access-kzjhx" (OuterVolumeSpecName: "kube-api-access-kzjhx") pod "48087522-736e-4c7f-9619-705b1beb5c7d" (UID: "48087522-736e-4c7f-9619-705b1beb5c7d"). InnerVolumeSpecName "kube-api-access-kzjhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.311539 4959 generic.go:334] "Generic (PLEG): container finished" podID="48087522-736e-4c7f-9619-705b1beb5c7d" containerID="e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403" exitCode=0
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.311596 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j42sd"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.311607 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerDied","Data":"e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403"}
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.311687 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j42sd" event={"ID":"48087522-736e-4c7f-9619-705b1beb5c7d","Type":"ContainerDied","Data":"9ae97d31722201fac542486ab37e61b20817458237171ebb6bd207dbfc6c7e49"}
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.311710 4959 scope.go:117] "RemoveContainer" containerID="e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.321712 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjhx\" (UniqueName: \"kubernetes.io/projected/48087522-736e-4c7f-9619-705b1beb5c7d-kube-api-access-kzjhx\") on node \"crc\" DevicePath \"\""
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.321745 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.333463 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48087522-736e-4c7f-9619-705b1beb5c7d" (UID: "48087522-736e-4c7f-9619-705b1beb5c7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.341478 4959 scope.go:117] "RemoveContainer" containerID="0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.377844 4959 scope.go:117] "RemoveContainer" containerID="d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.421911 4959 scope.go:117] "RemoveContainer" containerID="e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403"
Oct 07 14:40:28 crc kubenswrapper[4959]: E1007 14:40:28.422426 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403\": container with ID starting with e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403 not found: ID does not exist" containerID="e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.422465 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403"} err="failed to get container status \"e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403\": rpc error: code = NotFound desc = could not find container \"e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403\": container with ID starting with e0c7c2a9013f695e753214740364268dab417950817817e16b4c3dbc745dd403 not found: ID does not exist"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.422494 4959 scope.go:117] "RemoveContainer" containerID="0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395"
Oct 07 14:40:28 crc kubenswrapper[4959]: E1007 14:40:28.422930 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395\": container with ID starting with 0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395 not found: ID does not exist" containerID="0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.422964 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395"} err="failed to get container status \"0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395\": rpc error: code = NotFound desc = could not find container \"0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395\": container with ID starting with 0ae58cada8030343990f38b181e7f2a320b85c5dba7a887afd071371b1c77395 not found: ID does not exist"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.422982 4959 scope.go:117] "RemoveContainer" containerID="d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212"
Oct 07 14:40:28 crc kubenswrapper[4959]: E1007 14:40:28.423469 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212\": container with ID starting with d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212 not found: ID does not exist" containerID="d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.423499 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212"} err="failed to get container status \"d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212\": rpc error: code = NotFound desc = could not find container \"d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212\": container with ID starting with d3df0e58afc71c20543c30f681fb524f187dcba308797949fbb6761dbd4de212 not found: ID does not exist"
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.423620 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48087522-736e-4c7f-9619-705b1beb5c7d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.648180 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j42sd"]
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.660964 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j42sd"]
Oct 07 14:40:28 crc kubenswrapper[4959]: I1007 14:40:28.823415 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" path="/var/lib/kubelet/pods/48087522-736e-4c7f-9619-705b1beb5c7d/volumes"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.322256 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qghqg"]
Oct 07 14:40:34 crc kubenswrapper[4959]: E1007 14:40:34.323202 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="registry-server"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.323216 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="registry-server"
Oct 07 14:40:34 crc kubenswrapper[4959]: E1007 14:40:34.323235 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="extract-utilities"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.323242 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="extract-utilities"
Oct 07 14:40:34 crc kubenswrapper[4959]: E1007 14:40:34.323268 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="extract-content"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.323274 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="extract-content"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.323452 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="48087522-736e-4c7f-9619-705b1beb5c7d" containerName="registry-server"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.324863 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.336285 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qghqg"]
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.444877 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrzr\" (UniqueName: \"kubernetes.io/projected/a135d7d8-db78-4a04-9364-989e60f1b2a7-kube-api-access-wxrzr\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.444944 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-catalog-content\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.445076 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-utilities\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.546741 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrzr\" (UniqueName: \"kubernetes.io/projected/a135d7d8-db78-4a04-9364-989e60f1b2a7-kube-api-access-wxrzr\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.547144 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-catalog-content\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.547334 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-utilities\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.547645 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-catalog-content\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.547859 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-utilities\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.565179 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrzr\" (UniqueName: \"kubernetes.io/projected/a135d7d8-db78-4a04-9364-989e60f1b2a7-kube-api-access-wxrzr\") pod \"certified-operators-qghqg\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") " pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:34 crc kubenswrapper[4959]: I1007 14:40:34.651042 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:35 crc kubenswrapper[4959]: I1007 14:40:35.153569 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qghqg"]
Oct 07 14:40:35 crc kubenswrapper[4959]: I1007 14:40:35.393053 4959 generic.go:334] "Generic (PLEG): container finished" podID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerID="4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6" exitCode=0
Oct 07 14:40:35 crc kubenswrapper[4959]: I1007 14:40:35.393097 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerDied","Data":"4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6"}
Oct 07 14:40:35 crc kubenswrapper[4959]: I1007 14:40:35.393124 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerStarted","Data":"a41a51c6fcbbadb37a037890e04798a75179f60da7ecd0f6fad6b152cab0379e"}
Oct 07 14:40:36 crc kubenswrapper[4959]: I1007 14:40:36.809324 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:40:36 crc kubenswrapper[4959]: E1007 14:40:36.811166 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:40:37 crc kubenswrapper[4959]: I1007 14:40:37.448135 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerStarted","Data":"59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b"}
Oct 07 14:40:38 crc kubenswrapper[4959]: I1007 14:40:38.458608 4959 generic.go:334] "Generic (PLEG): container finished" podID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerID="59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b" exitCode=0
Oct 07 14:40:38 crc kubenswrapper[4959]: I1007 14:40:38.458677 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerDied","Data":"59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b"}
Oct 07 14:40:39 crc kubenswrapper[4959]: I1007 14:40:39.474114 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerStarted","Data":"ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6"}
Oct 07 14:40:39 crc kubenswrapper[4959]: I1007 14:40:39.495217 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qghqg" podStartSLOduration=1.820081603 podStartE2EDuration="5.495193402s" podCreationTimestamp="2025-10-07 14:40:34 +0000 UTC" firstStartedPulling="2025-10-07 14:40:35.394792159 +0000 UTC m=+5987.555514836" lastFinishedPulling="2025-10-07 14:40:39.069903958 +0000 UTC m=+5991.230626635" observedRunningTime="2025-10-07 14:40:39.491719202 +0000 UTC m=+5991.652441899" watchObservedRunningTime="2025-10-07 14:40:39.495193402 +0000 UTC m=+5991.655916079"
Oct 07 14:40:44 crc kubenswrapper[4959]: I1007 14:40:44.651480 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:44 crc kubenswrapper[4959]: I1007 14:40:44.652077 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:44 crc kubenswrapper[4959]: I1007 14:40:44.698274 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:45 crc kubenswrapper[4959]: I1007 14:40:45.579362 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:45 crc kubenswrapper[4959]: I1007 14:40:45.642162 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qghqg"]
Oct 07 14:40:47 crc kubenswrapper[4959]: I1007 14:40:47.537150 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qghqg" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="registry-server" containerID="cri-o://ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6" gracePeriod=2
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.228331 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.363614 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-catalog-content\") pod \"a135d7d8-db78-4a04-9364-989e60f1b2a7\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") "
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.363767 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxrzr\" (UniqueName: \"kubernetes.io/projected/a135d7d8-db78-4a04-9364-989e60f1b2a7-kube-api-access-wxrzr\") pod \"a135d7d8-db78-4a04-9364-989e60f1b2a7\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") "
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.363824 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-utilities\") pod \"a135d7d8-db78-4a04-9364-989e60f1b2a7\" (UID: \"a135d7d8-db78-4a04-9364-989e60f1b2a7\") "
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.366570 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-utilities" (OuterVolumeSpecName: "utilities") pod "a135d7d8-db78-4a04-9364-989e60f1b2a7" (UID: "a135d7d8-db78-4a04-9364-989e60f1b2a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.370407 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a135d7d8-db78-4a04-9364-989e60f1b2a7-kube-api-access-wxrzr" (OuterVolumeSpecName: "kube-api-access-wxrzr") pod "a135d7d8-db78-4a04-9364-989e60f1b2a7" (UID: "a135d7d8-db78-4a04-9364-989e60f1b2a7"). InnerVolumeSpecName "kube-api-access-wxrzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.415318 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a135d7d8-db78-4a04-9364-989e60f1b2a7" (UID: "a135d7d8-db78-4a04-9364-989e60f1b2a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.466583 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxrzr\" (UniqueName: \"kubernetes.io/projected/a135d7d8-db78-4a04-9364-989e60f1b2a7-kube-api-access-wxrzr\") on node \"crc\" DevicePath \"\""
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.466647 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.466659 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a135d7d8-db78-4a04-9364-989e60f1b2a7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.548158 4959 generic.go:334] "Generic (PLEG): container finished" podID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerID="ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6" exitCode=0
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.548203 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerDied","Data":"ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6"}
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.548237 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qghqg" event={"ID":"a135d7d8-db78-4a04-9364-989e60f1b2a7","Type":"ContainerDied","Data":"a41a51c6fcbbadb37a037890e04798a75179f60da7ecd0f6fad6b152cab0379e"}
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.548273 4959 scope.go:117] "RemoveContainer" containerID="ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6"
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.548413 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qghqg"
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.579901 4959 scope.go:117] "RemoveContainer" containerID="59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b"
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.585005 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qghqg"]
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.593232 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qghqg"]
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.607454 4959 scope.go:117] "RemoveContainer" containerID="4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6"
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.658327 4959 scope.go:117] "RemoveContainer" containerID="ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6"
Oct 07 14:40:48 crc kubenswrapper[4959]: E1007 14:40:48.658827 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6\": container with ID starting with ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6 not found: ID does not exist" containerID="ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6"
Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 
14:40:48.658876 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6"} err="failed to get container status \"ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6\": rpc error: code = NotFound desc = could not find container \"ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6\": container with ID starting with ca4dc1bcac54da2668e10f5357fe3b698aaf927f8bb43843614c89e32200d0c6 not found: ID does not exist" Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.658904 4959 scope.go:117] "RemoveContainer" containerID="59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b" Oct 07 14:40:48 crc kubenswrapper[4959]: E1007 14:40:48.659338 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b\": container with ID starting with 59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b not found: ID does not exist" containerID="59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b" Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.659369 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b"} err="failed to get container status \"59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b\": rpc error: code = NotFound desc = could not find container \"59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b\": container with ID starting with 59a7ee657cc8288302e6b79aacef421406f53d29efa5bf3b8dae5b41361dce4b not found: ID does not exist" Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.659393 4959 scope.go:117] "RemoveContainer" containerID="4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6" Oct 07 14:40:48 crc 
kubenswrapper[4959]: E1007 14:40:48.659811 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6\": container with ID starting with 4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6 not found: ID does not exist" containerID="4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6" Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.659830 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6"} err="failed to get container status \"4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6\": rpc error: code = NotFound desc = could not find container \"4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6\": container with ID starting with 4f7c8972113a824bfff016e0dcb7ac577a38b9e9a78c88385fdb975c26a212f6 not found: ID does not exist" Oct 07 14:40:48 crc kubenswrapper[4959]: I1007 14:40:48.821382 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" path="/var/lib/kubelet/pods/a135d7d8-db78-4a04-9364-989e60f1b2a7/volumes" Oct 07 14:40:51 crc kubenswrapper[4959]: I1007 14:40:51.808419 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:40:51 crc kubenswrapper[4959]: E1007 14:40:51.808937 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:41:04 crc 
kubenswrapper[4959]: I1007 14:41:04.809126 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:41:04 crc kubenswrapper[4959]: E1007 14:41:04.810235 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:41:15 crc kubenswrapper[4959]: I1007 14:41:15.809090 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:41:15 crc kubenswrapper[4959]: E1007 14:41:15.810026 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:41:26 crc kubenswrapper[4959]: I1007 14:41:26.809555 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:41:26 crc kubenswrapper[4959]: E1007 14:41:26.810954 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 
07 14:41:40 crc kubenswrapper[4959]: I1007 14:41:40.809357 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:41:40 crc kubenswrapper[4959]: E1007 14:41:40.810541 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:41:52 crc kubenswrapper[4959]: I1007 14:41:52.809833 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:41:52 crc kubenswrapper[4959]: E1007 14:41:52.810815 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:42:04 crc kubenswrapper[4959]: I1007 14:42:04.808756 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:42:04 crc kubenswrapper[4959]: E1007 14:42:04.810040 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:42:15 crc kubenswrapper[4959]: I1007 14:42:15.809953 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:42:15 crc kubenswrapper[4959]: E1007 14:42:15.810905 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:42:27 crc kubenswrapper[4959]: I1007 14:42:27.809253 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:42:27 crc kubenswrapper[4959]: E1007 14:42:27.810068 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:42:39 crc kubenswrapper[4959]: I1007 14:42:39.809179 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:42:39 crc kubenswrapper[4959]: E1007 14:42:39.809929 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:42:52 crc kubenswrapper[4959]: I1007 14:42:52.808781 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:42:52 crc kubenswrapper[4959]: E1007 14:42:52.809653 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:43:03 crc kubenswrapper[4959]: I1007 14:43:03.810347 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:43:03 crc kubenswrapper[4959]: E1007 14:43:03.812063 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:43:17 crc kubenswrapper[4959]: I1007 14:43:17.809823 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:43:17 crc kubenswrapper[4959]: E1007 14:43:17.811084 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:43:32 crc kubenswrapper[4959]: I1007 14:43:32.809735 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:43:32 crc kubenswrapper[4959]: E1007 14:43:32.810747 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:43:46 crc kubenswrapper[4959]: I1007 14:43:46.808749 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:43:46 crc kubenswrapper[4959]: E1007 14:43:46.809940 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:44:01 crc kubenswrapper[4959]: I1007 14:44:01.809171 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:44:01 crc kubenswrapper[4959]: E1007 14:44:01.810190 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.145092 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nfl5v"] Oct 07 14:44:10 crc kubenswrapper[4959]: E1007 14:44:10.147170 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="extract-content" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.147191 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="extract-content" Oct 07 14:44:10 crc kubenswrapper[4959]: E1007 14:44:10.147204 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="extract-utilities" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.147212 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="extract-utilities" Oct 07 14:44:10 crc kubenswrapper[4959]: E1007 14:44:10.147235 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="registry-server" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.147243 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="registry-server" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.147466 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a135d7d8-db78-4a04-9364-989e60f1b2a7" containerName="registry-server" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.149310 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.158564 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfl5v"] Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.269025 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfzb\" (UniqueName: \"kubernetes.io/projected/efa452d8-e34f-4dcb-a8a5-16143e44f093-kube-api-access-znfzb\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.269116 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-catalog-content\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.269296 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-utilities\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.372425 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfzb\" (UniqueName: \"kubernetes.io/projected/efa452d8-e34f-4dcb-a8a5-16143e44f093-kube-api-access-znfzb\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.372514 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-catalog-content\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.372559 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-utilities\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.373465 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-utilities\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.373592 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-catalog-content\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.391961 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfzb\" (UniqueName: \"kubernetes.io/projected/efa452d8-e34f-4dcb-a8a5-16143e44f093-kube-api-access-znfzb\") pod \"community-operators-nfl5v\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") " pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:10 crc kubenswrapper[4959]: I1007 14:44:10.468141 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:11 crc kubenswrapper[4959]: I1007 14:44:11.062893 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfl5v"] Oct 07 14:44:11 crc kubenswrapper[4959]: I1007 14:44:11.597648 4959 generic.go:334] "Generic (PLEG): container finished" podID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerID="cbc99fa6ebe34cb05b281bbc52acea328c5098c40b761b3292aac27f2a856091" exitCode=0 Oct 07 14:44:11 crc kubenswrapper[4959]: I1007 14:44:11.597686 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfl5v" event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerDied","Data":"cbc99fa6ebe34cb05b281bbc52acea328c5098c40b761b3292aac27f2a856091"} Oct 07 14:44:11 crc kubenswrapper[4959]: I1007 14:44:11.597709 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfl5v" event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerStarted","Data":"83a596531344bdf9781e712a566ae97d58560567e68b7682f20bc807c9a6944d"} Oct 07 14:44:11 crc kubenswrapper[4959]: I1007 14:44:11.600894 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:44:14 crc kubenswrapper[4959]: I1007 14:44:14.810648 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7" Oct 07 14:44:15 crc kubenswrapper[4959]: I1007 14:44:15.637654 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"9c05042d48720b75f72fc0cc073b4bc092363c73b2ed69ca357313a93942600a"} Oct 07 14:44:15 crc kubenswrapper[4959]: I1007 14:44:15.639685 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nfl5v" event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerStarted","Data":"3fad04095578940c67b073f33ea93cc808ecd3790fa93785c97df1c0a88a818d"} Oct 07 14:44:17 crc kubenswrapper[4959]: I1007 14:44:17.657846 4959 generic.go:334] "Generic (PLEG): container finished" podID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerID="3fad04095578940c67b073f33ea93cc808ecd3790fa93785c97df1c0a88a818d" exitCode=0 Oct 07 14:44:17 crc kubenswrapper[4959]: I1007 14:44:17.657942 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfl5v" event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerDied","Data":"3fad04095578940c67b073f33ea93cc808ecd3790fa93785c97df1c0a88a818d"} Oct 07 14:44:18 crc kubenswrapper[4959]: I1007 14:44:18.669778 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfl5v" event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerStarted","Data":"7967e4162a85d7d59f5b309ace874ba3f63da1737122e6c651bbc70f4337dcb6"} Oct 07 14:44:18 crc kubenswrapper[4959]: I1007 14:44:18.688931 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nfl5v" podStartSLOduration=2.200328213 podStartE2EDuration="8.688913077s" podCreationTimestamp="2025-10-07 14:44:10 +0000 UTC" firstStartedPulling="2025-10-07 14:44:11.600644405 +0000 UTC m=+6203.761367082" lastFinishedPulling="2025-10-07 14:44:18.089229269 +0000 UTC m=+6210.249951946" observedRunningTime="2025-10-07 14:44:18.683840711 +0000 UTC m=+6210.844563398" watchObservedRunningTime="2025-10-07 14:44:18.688913077 +0000 UTC m=+6210.849635754" Oct 07 14:44:20 crc kubenswrapper[4959]: I1007 14:44:20.468821 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:20 crc kubenswrapper[4959]: I1007 14:44:20.469398 
4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:21 crc kubenswrapper[4959]: I1007 14:44:21.515852 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nfl5v" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="registry-server" probeResult="failure" output=< Oct 07 14:44:21 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 14:44:21 crc kubenswrapper[4959]: > Oct 07 14:44:30 crc kubenswrapper[4959]: I1007 14:44:30.517402 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:30 crc kubenswrapper[4959]: I1007 14:44:30.569869 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nfl5v" Oct 07 14:44:33 crc kubenswrapper[4959]: I1007 14:44:33.574384 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfl5v"] Oct 07 14:44:33 crc kubenswrapper[4959]: I1007 14:44:33.576784 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nfl5v" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="registry-server" containerID="cri-o://7967e4162a85d7d59f5b309ace874ba3f63da1737122e6c651bbc70f4337dcb6" gracePeriod=2 Oct 07 14:44:33 crc kubenswrapper[4959]: I1007 14:44:33.812802 4959 generic.go:334] "Generic (PLEG): container finished" podID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerID="7967e4162a85d7d59f5b309ace874ba3f63da1737122e6c651bbc70f4337dcb6" exitCode=0 Oct 07 14:44:33 crc kubenswrapper[4959]: I1007 14:44:33.812874 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfl5v" 
event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerDied","Data":"7967e4162a85d7d59f5b309ace874ba3f63da1737122e6c651bbc70f4337dcb6"}
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.686898 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfl5v"
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.800293 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znfzb\" (UniqueName: \"kubernetes.io/projected/efa452d8-e34f-4dcb-a8a5-16143e44f093-kube-api-access-znfzb\") pod \"efa452d8-e34f-4dcb-a8a5-16143e44f093\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") "
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.800420 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-utilities\") pod \"efa452d8-e34f-4dcb-a8a5-16143e44f093\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") "
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.800478 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-catalog-content\") pod \"efa452d8-e34f-4dcb-a8a5-16143e44f093\" (UID: \"efa452d8-e34f-4dcb-a8a5-16143e44f093\") "
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.801541 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-utilities" (OuterVolumeSpecName: "utilities") pod "efa452d8-e34f-4dcb-a8a5-16143e44f093" (UID: "efa452d8-e34f-4dcb-a8a5-16143e44f093"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.802362 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.806841 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa452d8-e34f-4dcb-a8a5-16143e44f093-kube-api-access-znfzb" (OuterVolumeSpecName: "kube-api-access-znfzb") pod "efa452d8-e34f-4dcb-a8a5-16143e44f093" (UID: "efa452d8-e34f-4dcb-a8a5-16143e44f093"). InnerVolumeSpecName "kube-api-access-znfzb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.822512 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfl5v"
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.851173 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfl5v" event={"ID":"efa452d8-e34f-4dcb-a8a5-16143e44f093","Type":"ContainerDied","Data":"83a596531344bdf9781e712a566ae97d58560567e68b7682f20bc807c9a6944d"}
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.851243 4959 scope.go:117] "RemoveContainer" containerID="7967e4162a85d7d59f5b309ace874ba3f63da1737122e6c651bbc70f4337dcb6"
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.852012 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efa452d8-e34f-4dcb-a8a5-16143e44f093" (UID: "efa452d8-e34f-4dcb-a8a5-16143e44f093"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.880166 4959 scope.go:117] "RemoveContainer" containerID="3fad04095578940c67b073f33ea93cc808ecd3790fa93785c97df1c0a88a818d"
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.902807 4959 scope.go:117] "RemoveContainer" containerID="cbc99fa6ebe34cb05b281bbc52acea328c5098c40b761b3292aac27f2a856091"
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.904030 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znfzb\" (UniqueName: \"kubernetes.io/projected/efa452d8-e34f-4dcb-a8a5-16143e44f093-kube-api-access-znfzb\") on node \"crc\" DevicePath \"\""
Oct 07 14:44:34 crc kubenswrapper[4959]: I1007 14:44:34.904079 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efa452d8-e34f-4dcb-a8a5-16143e44f093-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:44:35 crc kubenswrapper[4959]: I1007 14:44:35.155509 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfl5v"]
Oct 07 14:44:35 crc kubenswrapper[4959]: I1007 14:44:35.163465 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nfl5v"]
Oct 07 14:44:36 crc kubenswrapper[4959]: I1007 14:44:36.819867 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" path="/var/lib/kubelet/pods/efa452d8-e34f-4dcb-a8a5-16143e44f093/volumes"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.175356 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"]
Oct 07 14:45:00 crc kubenswrapper[4959]: E1007 14:45:00.176411 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="registry-server"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.176428 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="registry-server"
Oct 07 14:45:00 crc kubenswrapper[4959]: E1007 14:45:00.176465 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="extract-content"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.176472 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="extract-content"
Oct 07 14:45:00 crc kubenswrapper[4959]: E1007 14:45:00.176491 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="extract-utilities"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.176499 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="extract-utilities"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.176688 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa452d8-e34f-4dcb-a8a5-16143e44f093" containerName="registry-server"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.177396 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.188543 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"]
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.189108 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.189528 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.300554 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82c7f350-283b-4644-92cc-0d1546edfe88-config-volume\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.301400 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7rs\" (UniqueName: \"kubernetes.io/projected/82c7f350-283b-4644-92cc-0d1546edfe88-kube-api-access-6s7rs\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.301433 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82c7f350-283b-4644-92cc-0d1546edfe88-secret-volume\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.403881 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7rs\" (UniqueName: \"kubernetes.io/projected/82c7f350-283b-4644-92cc-0d1546edfe88-kube-api-access-6s7rs\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.403965 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82c7f350-283b-4644-92cc-0d1546edfe88-secret-volume\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.404025 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82c7f350-283b-4644-92cc-0d1546edfe88-config-volume\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.405474 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82c7f350-283b-4644-92cc-0d1546edfe88-config-volume\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.415669 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82c7f350-283b-4644-92cc-0d1546edfe88-secret-volume\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.425613 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s7rs\" (UniqueName: \"kubernetes.io/projected/82c7f350-283b-4644-92cc-0d1546edfe88-kube-api-access-6s7rs\") pod \"collect-profiles-29330805-km6jn\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:00 crc kubenswrapper[4959]: I1007 14:45:00.520669 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:01 crc kubenswrapper[4959]: I1007 14:45:01.039677 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"]
Oct 07 14:45:01 crc kubenswrapper[4959]: W1007 14:45:01.042747 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c7f350_283b_4644_92cc_0d1546edfe88.slice/crio-4692497e60f0577de54ddd927d927deb67e88fce8778f4cbc8d9ccab1ab5bb03 WatchSource:0}: Error finding container 4692497e60f0577de54ddd927d927deb67e88fce8778f4cbc8d9ccab1ab5bb03: Status 404 returned error can't find the container with id 4692497e60f0577de54ddd927d927deb67e88fce8778f4cbc8d9ccab1ab5bb03
Oct 07 14:45:01 crc kubenswrapper[4959]: I1007 14:45:01.087359 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn" event={"ID":"82c7f350-283b-4644-92cc-0d1546edfe88","Type":"ContainerStarted","Data":"4692497e60f0577de54ddd927d927deb67e88fce8778f4cbc8d9ccab1ab5bb03"}
Oct 07 14:45:02 crc kubenswrapper[4959]: I1007 14:45:02.100562 4959 generic.go:334] "Generic (PLEG): container finished" podID="82c7f350-283b-4644-92cc-0d1546edfe88" containerID="32ee6979849600466f7bd44656735fa0866922eaab2f154fb0a5898721b12478" exitCode=0
Oct 07 14:45:02 crc kubenswrapper[4959]: I1007 14:45:02.100656 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn" event={"ID":"82c7f350-283b-4644-92cc-0d1546edfe88","Type":"ContainerDied","Data":"32ee6979849600466f7bd44656735fa0866922eaab2f154fb0a5898721b12478"}
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.654837 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.806707 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s7rs\" (UniqueName: \"kubernetes.io/projected/82c7f350-283b-4644-92cc-0d1546edfe88-kube-api-access-6s7rs\") pod \"82c7f350-283b-4644-92cc-0d1546edfe88\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") "
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.806788 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82c7f350-283b-4644-92cc-0d1546edfe88-config-volume\") pod \"82c7f350-283b-4644-92cc-0d1546edfe88\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") "
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.806948 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82c7f350-283b-4644-92cc-0d1546edfe88-secret-volume\") pod \"82c7f350-283b-4644-92cc-0d1546edfe88\" (UID: \"82c7f350-283b-4644-92cc-0d1546edfe88\") "
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.807714 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c7f350-283b-4644-92cc-0d1546edfe88-config-volume" (OuterVolumeSpecName: "config-volume") pod "82c7f350-283b-4644-92cc-0d1546edfe88" (UID: "82c7f350-283b-4644-92cc-0d1546edfe88"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.826797 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c7f350-283b-4644-92cc-0d1546edfe88-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82c7f350-283b-4644-92cc-0d1546edfe88" (UID: "82c7f350-283b-4644-92cc-0d1546edfe88"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.831107 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c7f350-283b-4644-92cc-0d1546edfe88-kube-api-access-6s7rs" (OuterVolumeSpecName: "kube-api-access-6s7rs") pod "82c7f350-283b-4644-92cc-0d1546edfe88" (UID: "82c7f350-283b-4644-92cc-0d1546edfe88"). InnerVolumeSpecName "kube-api-access-6s7rs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.908888 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s7rs\" (UniqueName: \"kubernetes.io/projected/82c7f350-283b-4644-92cc-0d1546edfe88-kube-api-access-6s7rs\") on node \"crc\" DevicePath \"\""
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.908923 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82c7f350-283b-4644-92cc-0d1546edfe88-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 14:45:03 crc kubenswrapper[4959]: I1007 14:45:03.908937 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82c7f350-283b-4644-92cc-0d1546edfe88-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 14:45:04 crc kubenswrapper[4959]: I1007 14:45:04.124256 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn" event={"ID":"82c7f350-283b-4644-92cc-0d1546edfe88","Type":"ContainerDied","Data":"4692497e60f0577de54ddd927d927deb67e88fce8778f4cbc8d9ccab1ab5bb03"}
Oct 07 14:45:04 crc kubenswrapper[4959]: I1007 14:45:04.124570 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4692497e60f0577de54ddd927d927deb67e88fce8778f4cbc8d9ccab1ab5bb03"
Oct 07 14:45:04 crc kubenswrapper[4959]: I1007 14:45:04.124410 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"
Oct 07 14:45:04 crc kubenswrapper[4959]: I1007 14:45:04.743495 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78"]
Oct 07 14:45:04 crc kubenswrapper[4959]: I1007 14:45:04.751378 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-tdd78"]
Oct 07 14:45:04 crc kubenswrapper[4959]: I1007 14:45:04.827933 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb9815f-4fbe-4bc0-8f97-c7c400851e18" path="/var/lib/kubelet/pods/6eb9815f-4fbe-4bc0-8f97-c7c400851e18/volumes"
Oct 07 14:45:56 crc kubenswrapper[4959]: I1007 14:45:56.199321 4959 scope.go:117] "RemoveContainer" containerID="851070a6390417fe3f4c60db43dae2c6a48d72a57960c337763e3ed148d04855"
Oct 07 14:46:37 crc kubenswrapper[4959]: I1007 14:46:37.695749 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:46:37 crc kubenswrapper[4959]: I1007 14:46:37.696468 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:47:07 crc kubenswrapper[4959]: I1007 14:47:07.696086 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:47:07 crc kubenswrapper[4959]: I1007 14:47:07.697504 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:47:37 crc kubenswrapper[4959]: I1007 14:47:37.695671 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:47:37 crc kubenswrapper[4959]: I1007 14:47:37.696539 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:47:37 crc kubenswrapper[4959]: I1007 14:47:37.696654 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 14:47:37 crc kubenswrapper[4959]: I1007 14:47:37.698262 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c05042d48720b75f72fc0cc073b4bc092363c73b2ed69ca357313a93942600a"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 14:47:37 crc kubenswrapper[4959]: I1007 14:47:37.698381 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://9c05042d48720b75f72fc0cc073b4bc092363c73b2ed69ca357313a93942600a" gracePeriod=600
Oct 07 14:47:38 crc kubenswrapper[4959]: I1007 14:47:38.755902 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="9c05042d48720b75f72fc0cc073b4bc092363c73b2ed69ca357313a93942600a" exitCode=0
Oct 07 14:47:38 crc kubenswrapper[4959]: I1007 14:47:38.756018 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"9c05042d48720b75f72fc0cc073b4bc092363c73b2ed69ca357313a93942600a"}
Oct 07 14:47:38 crc kubenswrapper[4959]: I1007 14:47:38.758139 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"}
Oct 07 14:47:38 crc kubenswrapper[4959]: I1007 14:47:38.758242 4959 scope.go:117] "RemoveContainer" containerID="4c67dd05df6e4e1ffa013cef95e7ed482c98b841daa50f2cbb71a4e218350cf7"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.176109 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4h49j"]
Oct 07 14:49:01 crc kubenswrapper[4959]: E1007 14:49:01.177359 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c7f350-283b-4644-92cc-0d1546edfe88" containerName="collect-profiles"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.177379 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c7f350-283b-4644-92cc-0d1546edfe88" containerName="collect-profiles"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.177668 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c7f350-283b-4644-92cc-0d1546edfe88" containerName="collect-profiles"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.179112 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.199126 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h49j"]
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.308566 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-utilities\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.308822 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-catalog-content\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.308844 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh4cz\" (UniqueName: \"kubernetes.io/projected/d476ba71-a709-42a3-b3db-b95a999645ce-kube-api-access-mh4cz\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.411248 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-utilities\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.411366 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-catalog-content\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.411391 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh4cz\" (UniqueName: \"kubernetes.io/projected/d476ba71-a709-42a3-b3db-b95a999645ce-kube-api-access-mh4cz\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.412228 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-utilities\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.412485 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-catalog-content\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.434517 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh4cz\" (UniqueName: \"kubernetes.io/projected/d476ba71-a709-42a3-b3db-b95a999645ce-kube-api-access-mh4cz\") pod \"redhat-marketplace-4h49j\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") " pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:01 crc kubenswrapper[4959]: I1007 14:49:01.509144 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:02 crc kubenswrapper[4959]: I1007 14:49:02.021708 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h49j"]
Oct 07 14:49:02 crc kubenswrapper[4959]: I1007 14:49:02.579222 4959 generic.go:334] "Generic (PLEG): container finished" podID="d476ba71-a709-42a3-b3db-b95a999645ce" containerID="cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2" exitCode=0
Oct 07 14:49:02 crc kubenswrapper[4959]: I1007 14:49:02.579267 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerDied","Data":"cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2"}
Oct 07 14:49:02 crc kubenswrapper[4959]: I1007 14:49:02.579292 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerStarted","Data":"e89d19d9cd431188d41de7c4dc0466fff2684116a191299e4ce1e5ce071bc772"}
Oct 07 14:49:03 crc kubenswrapper[4959]: I1007 14:49:03.594584 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerStarted","Data":"767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a"}
Oct 07 14:49:04 crc kubenswrapper[4959]: I1007 14:49:04.608816 4959 generic.go:334] "Generic (PLEG): container finished" podID="d476ba71-a709-42a3-b3db-b95a999645ce" containerID="767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a" exitCode=0
Oct 07 14:49:04 crc kubenswrapper[4959]: I1007 14:49:04.608961 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerDied","Data":"767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a"}
Oct 07 14:49:05 crc kubenswrapper[4959]: I1007 14:49:05.620872 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerStarted","Data":"0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237"}
Oct 07 14:49:05 crc kubenswrapper[4959]: I1007 14:49:05.653409 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4h49j" podStartSLOduration=2.075117826 podStartE2EDuration="4.653386017s" podCreationTimestamp="2025-10-07 14:49:01 +0000 UTC" firstStartedPulling="2025-10-07 14:49:02.581481483 +0000 UTC m=+6494.742204160" lastFinishedPulling="2025-10-07 14:49:05.159749674 +0000 UTC m=+6497.320472351" observedRunningTime="2025-10-07 14:49:05.643756941 +0000 UTC m=+6497.804479638" watchObservedRunningTime="2025-10-07 14:49:05.653386017 +0000 UTC m=+6497.814108694"
Oct 07 14:49:11 crc kubenswrapper[4959]: I1007 14:49:11.510191 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:11 crc kubenswrapper[4959]: I1007 14:49:11.510572 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:11 crc kubenswrapper[4959]: I1007 14:49:11.555571 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:11 crc kubenswrapper[4959]: I1007 14:49:11.720469 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:11 crc kubenswrapper[4959]: I1007 14:49:11.787651 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h49j"]
Oct 07 14:49:13 crc kubenswrapper[4959]: I1007 14:49:13.698038 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4h49j" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="registry-server" containerID="cri-o://0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237" gracePeriod=2
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.323175 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.429504 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh4cz\" (UniqueName: \"kubernetes.io/projected/d476ba71-a709-42a3-b3db-b95a999645ce-kube-api-access-mh4cz\") pod \"d476ba71-a709-42a3-b3db-b95a999645ce\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") "
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.429720 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-catalog-content\") pod \"d476ba71-a709-42a3-b3db-b95a999645ce\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") "
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.429840 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-utilities\") pod \"d476ba71-a709-42a3-b3db-b95a999645ce\" (UID: \"d476ba71-a709-42a3-b3db-b95a999645ce\") "
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.430893 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-utilities" (OuterVolumeSpecName: "utilities") pod "d476ba71-a709-42a3-b3db-b95a999645ce" (UID: "d476ba71-a709-42a3-b3db-b95a999645ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.435933 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d476ba71-a709-42a3-b3db-b95a999645ce-kube-api-access-mh4cz" (OuterVolumeSpecName: "kube-api-access-mh4cz") pod "d476ba71-a709-42a3-b3db-b95a999645ce" (UID: "d476ba71-a709-42a3-b3db-b95a999645ce"). InnerVolumeSpecName "kube-api-access-mh4cz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.447224 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d476ba71-a709-42a3-b3db-b95a999645ce" (UID: "d476ba71-a709-42a3-b3db-b95a999645ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.533903 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.533965 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh4cz\" (UniqueName: \"kubernetes.io/projected/d476ba71-a709-42a3-b3db-b95a999645ce-kube-api-access-mh4cz\") on node \"crc\" DevicePath \"\""
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.533990 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d476ba71-a709-42a3-b3db-b95a999645ce-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.719427 4959 generic.go:334] "Generic (PLEG): container finished" podID="d476ba71-a709-42a3-b3db-b95a999645ce" containerID="0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237" exitCode=0
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.719550 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4h49j"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.719555 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerDied","Data":"0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237"}
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.720135 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h49j" event={"ID":"d476ba71-a709-42a3-b3db-b95a999645ce","Type":"ContainerDied","Data":"e89d19d9cd431188d41de7c4dc0466fff2684116a191299e4ce1e5ce071bc772"}
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.720177 4959 scope.go:117] "RemoveContainer" containerID="0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.758349 4959 scope.go:117] "RemoveContainer" containerID="767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.770692 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h49j"]
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.784244 4959 scope.go:117] "RemoveContainer" containerID="cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.784745 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h49j"]
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.826019 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" path="/var/lib/kubelet/pods/d476ba71-a709-42a3-b3db-b95a999645ce/volumes"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.840919 4959 scope.go:117] "RemoveContainer" containerID="0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237"
Oct 07 14:49:14 crc kubenswrapper[4959]: E1007 14:49:14.842255 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237\": container with ID starting with 0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237 not found: ID does not exist" containerID="0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.842321 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237"} err="failed to get container status \"0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237\": rpc error: code = NotFound desc = could not find container \"0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237\": container with ID starting with 0cc5449663c7fec966dc330e03c22c4ba18831d2731c2d4bc2c1be9b210a0237 not found: ID does not exist"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.842356 4959 scope.go:117] "RemoveContainer" containerID="767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a"
Oct 07 14:49:14 crc kubenswrapper[4959]: E1007 14:49:14.842900 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a\": container with ID starting with 767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a not found: ID does not exist" containerID="767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a"
Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.842964 4959 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a"} err="failed to get container status \"767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a\": rpc error: code = NotFound desc = could not find container \"767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a\": container with ID starting with 767f6a9a6da723dcbf047e23eca08ef9de3d6614ad35da3af34052aa4212613a not found: ID does not exist" Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.843010 4959 scope.go:117] "RemoveContainer" containerID="cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2" Oct 07 14:49:14 crc kubenswrapper[4959]: E1007 14:49:14.845599 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2\": container with ID starting with cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2 not found: ID does not exist" containerID="cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2" Oct 07 14:49:14 crc kubenswrapper[4959]: I1007 14:49:14.845744 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2"} err="failed to get container status \"cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2\": rpc error: code = NotFound desc = could not find container \"cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2\": container with ID starting with cf8064734c40f2a7689a9cdce2da9c2b1e52282c6a343e00ee63b27dceda26c2 not found: ID does not exist" Oct 07 14:50:07 crc kubenswrapper[4959]: I1007 14:50:07.696205 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:50:07 crc kubenswrapper[4959]: I1007 14:50:07.697018 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.249787 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qxzg9"] Oct 07 14:50:22 crc kubenswrapper[4959]: E1007 14:50:22.254318 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="registry-server" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.254344 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="registry-server" Oct 07 14:50:22 crc kubenswrapper[4959]: E1007 14:50:22.254356 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="extract-content" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.254362 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="extract-content" Oct 07 14:50:22 crc kubenswrapper[4959]: E1007 14:50:22.254374 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="extract-utilities" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.254382 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" containerName="extract-utilities" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.254600 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d476ba71-a709-42a3-b3db-b95a999645ce" 
containerName="registry-server" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.256438 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.282766 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxzg9"] Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.313050 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g4w\" (UniqueName: \"kubernetes.io/projected/0ea89cc7-d94c-43bf-b51c-082625b33915-kube-api-access-n5g4w\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.313543 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-utilities\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.313718 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-catalog-content\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.415186 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-utilities\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " 
pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.415319 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-catalog-content\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.415452 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g4w\" (UniqueName: \"kubernetes.io/projected/0ea89cc7-d94c-43bf-b51c-082625b33915-kube-api-access-n5g4w\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.415864 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-utilities\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.415962 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-catalog-content\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.437130 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g4w\" (UniqueName: \"kubernetes.io/projected/0ea89cc7-d94c-43bf-b51c-082625b33915-kube-api-access-n5g4w\") pod \"redhat-operators-qxzg9\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " pod="openshift-marketplace/redhat-operators-qxzg9" Oct 
07 14:50:22 crc kubenswrapper[4959]: I1007 14:50:22.600151 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:23 crc kubenswrapper[4959]: I1007 14:50:23.062851 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxzg9"] Oct 07 14:50:23 crc kubenswrapper[4959]: I1007 14:50:23.385160 4959 generic.go:334] "Generic (PLEG): container finished" podID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerID="c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1" exitCode=0 Oct 07 14:50:23 crc kubenswrapper[4959]: I1007 14:50:23.385273 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerDied","Data":"c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1"} Oct 07 14:50:23 crc kubenswrapper[4959]: I1007 14:50:23.385606 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerStarted","Data":"59239dffaead90ff48f188ba47691efd746c96a383289de6dc312ca9c2e67bb1"} Oct 07 14:50:23 crc kubenswrapper[4959]: I1007 14:50:23.387339 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:50:25 crc kubenswrapper[4959]: I1007 14:50:25.403147 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerStarted","Data":"d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce"} Oct 07 14:50:27 crc kubenswrapper[4959]: I1007 14:50:27.420724 4959 generic.go:334] "Generic (PLEG): container finished" podID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerID="d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce" exitCode=0 Oct 07 
14:50:27 crc kubenswrapper[4959]: I1007 14:50:27.420796 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerDied","Data":"d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce"} Oct 07 14:50:28 crc kubenswrapper[4959]: I1007 14:50:28.430816 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerStarted","Data":"3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af"} Oct 07 14:50:28 crc kubenswrapper[4959]: I1007 14:50:28.471676 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxzg9" podStartSLOduration=1.970667634 podStartE2EDuration="6.471650991s" podCreationTimestamp="2025-10-07 14:50:22 +0000 UTC" firstStartedPulling="2025-10-07 14:50:23.387042999 +0000 UTC m=+6575.547765666" lastFinishedPulling="2025-10-07 14:50:27.888026346 +0000 UTC m=+6580.048749023" observedRunningTime="2025-10-07 14:50:28.462366254 +0000 UTC m=+6580.623088961" watchObservedRunningTime="2025-10-07 14:50:28.471650991 +0000 UTC m=+6580.632373668" Oct 07 14:50:32 crc kubenswrapper[4959]: I1007 14:50:32.600488 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:32 crc kubenswrapper[4959]: I1007 14:50:32.601110 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:32 crc kubenswrapper[4959]: I1007 14:50:32.647989 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:33 crc kubenswrapper[4959]: I1007 14:50:33.521322 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:33 crc kubenswrapper[4959]: I1007 14:50:33.575754 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxzg9"] Oct 07 14:50:35 crc kubenswrapper[4959]: I1007 14:50:35.487186 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qxzg9" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="registry-server" containerID="cri-o://3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af" gracePeriod=2 Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.035430 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.133539 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-catalog-content\") pod \"0ea89cc7-d94c-43bf-b51c-082625b33915\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.133620 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-utilities\") pod \"0ea89cc7-d94c-43bf-b51c-082625b33915\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.133663 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5g4w\" (UniqueName: \"kubernetes.io/projected/0ea89cc7-d94c-43bf-b51c-082625b33915-kube-api-access-n5g4w\") pod \"0ea89cc7-d94c-43bf-b51c-082625b33915\" (UID: \"0ea89cc7-d94c-43bf-b51c-082625b33915\") " Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.134553 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-utilities" (OuterVolumeSpecName: "utilities") pod "0ea89cc7-d94c-43bf-b51c-082625b33915" (UID: "0ea89cc7-d94c-43bf-b51c-082625b33915"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.141881 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea89cc7-d94c-43bf-b51c-082625b33915-kube-api-access-n5g4w" (OuterVolumeSpecName: "kube-api-access-n5g4w") pod "0ea89cc7-d94c-43bf-b51c-082625b33915" (UID: "0ea89cc7-d94c-43bf-b51c-082625b33915"). InnerVolumeSpecName "kube-api-access-n5g4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.229939 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ea89cc7-d94c-43bf-b51c-082625b33915" (UID: "0ea89cc7-d94c-43bf-b51c-082625b33915"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.235973 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.236152 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea89cc7-d94c-43bf-b51c-082625b33915-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.236244 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5g4w\" (UniqueName: \"kubernetes.io/projected/0ea89cc7-d94c-43bf-b51c-082625b33915-kube-api-access-n5g4w\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.498985 4959 generic.go:334] "Generic (PLEG): container finished" podID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerID="3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af" exitCode=0 Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.499031 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerDied","Data":"3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af"} Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.499068 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzg9" event={"ID":"0ea89cc7-d94c-43bf-b51c-082625b33915","Type":"ContainerDied","Data":"59239dffaead90ff48f188ba47691efd746c96a383289de6dc312ca9c2e67bb1"} Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.499090 4959 scope.go:117] "RemoveContainer" containerID="3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.500176 
4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzg9" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.519865 4959 scope.go:117] "RemoveContainer" containerID="d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.541681 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxzg9"] Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.550903 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxzg9"] Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.566811 4959 scope.go:117] "RemoveContainer" containerID="c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.593705 4959 scope.go:117] "RemoveContainer" containerID="3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af" Oct 07 14:50:36 crc kubenswrapper[4959]: E1007 14:50:36.594457 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af\": container with ID starting with 3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af not found: ID does not exist" containerID="3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.594503 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af"} err="failed to get container status \"3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af\": rpc error: code = NotFound desc = could not find container \"3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af\": container with ID starting with 
3f36a9f8c36a5c96c257dd668e5b0759a6ed6de706aa38cb784f767fc97cb5af not found: ID does not exist" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.594538 4959 scope.go:117] "RemoveContainer" containerID="d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce" Oct 07 14:50:36 crc kubenswrapper[4959]: E1007 14:50:36.595017 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce\": container with ID starting with d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce not found: ID does not exist" containerID="d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.595058 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce"} err="failed to get container status \"d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce\": rpc error: code = NotFound desc = could not find container \"d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce\": container with ID starting with d14dd74bb57a787264925b5ae3c75f2dee096d1a3860c419a6185d7b6ee381ce not found: ID does not exist" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.595083 4959 scope.go:117] "RemoveContainer" containerID="c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1" Oct 07 14:50:36 crc kubenswrapper[4959]: E1007 14:50:36.595392 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1\": container with ID starting with c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1 not found: ID does not exist" containerID="c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1" Oct 07 14:50:36 crc 
kubenswrapper[4959]: I1007 14:50:36.595420 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1"} err="failed to get container status \"c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1\": rpc error: code = NotFound desc = could not find container \"c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1\": container with ID starting with c04ca45e338bd2db839716f8445e84706c09f07ab96f216e2074f66d2ec4afe1 not found: ID does not exist" Oct 07 14:50:36 crc kubenswrapper[4959]: I1007 14:50:36.819796 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" path="/var/lib/kubelet/pods/0ea89cc7-d94c-43bf-b51c-082625b33915/volumes" Oct 07 14:50:37 crc kubenswrapper[4959]: I1007 14:50:37.695461 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:50:37 crc kubenswrapper[4959]: I1007 14:50:37.695530 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.364455 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nb6tc"] Oct 07 14:50:38 crc kubenswrapper[4959]: E1007 14:50:38.365462 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="extract-utilities" Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 
14:50:38.365805 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="extract-utilities" Oct 07 14:50:38 crc kubenswrapper[4959]: E1007 14:50:38.365818 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="extract-content" Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.365825 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="extract-content" Oct 07 14:50:38 crc kubenswrapper[4959]: E1007 14:50:38.365847 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="registry-server" Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.365855 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="registry-server" Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.366081 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea89cc7-d94c-43bf-b51c-082625b33915" containerName="registry-server" Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.368237 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.379982 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nb6tc"]
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.483747 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-utilities\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.484122 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wt6g\" (UniqueName: \"kubernetes.io/projected/f561537a-9e18-406d-910f-d75c86c27c5a-kube-api-access-9wt6g\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.484300 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-catalog-content\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.586078 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-utilities\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.586123 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wt6g\" (UniqueName: \"kubernetes.io/projected/f561537a-9e18-406d-910f-d75c86c27c5a-kube-api-access-9wt6g\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.586186 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-catalog-content\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.586660 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-catalog-content\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.586705 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-utilities\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.614932 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wt6g\" (UniqueName: \"kubernetes.io/projected/f561537a-9e18-406d-910f-d75c86c27c5a-kube-api-access-9wt6g\") pod \"certified-operators-nb6tc\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") " pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:38 crc kubenswrapper[4959]: I1007 14:50:38.708269 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:39 crc kubenswrapper[4959]: I1007 14:50:39.245238 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nb6tc"]
Oct 07 14:50:39 crc kubenswrapper[4959]: I1007 14:50:39.533271 4959 generic.go:334] "Generic (PLEG): container finished" podID="f561537a-9e18-406d-910f-d75c86c27c5a" containerID="035289f236d6d3021f2850bd0d39e0186ea3d5d7f03c9e6303c8466d53c82995" exitCode=0
Oct 07 14:50:39 crc kubenswrapper[4959]: I1007 14:50:39.533434 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nb6tc" event={"ID":"f561537a-9e18-406d-910f-d75c86c27c5a","Type":"ContainerDied","Data":"035289f236d6d3021f2850bd0d39e0186ea3d5d7f03c9e6303c8466d53c82995"}
Oct 07 14:50:39 crc kubenswrapper[4959]: I1007 14:50:39.533804 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nb6tc" event={"ID":"f561537a-9e18-406d-910f-d75c86c27c5a","Type":"ContainerStarted","Data":"1fac2a06cc2515fd2438df44437e6b06091763995913096f87276e912a85b719"}
Oct 07 14:50:41 crc kubenswrapper[4959]: I1007 14:50:41.553999 4959 generic.go:334] "Generic (PLEG): container finished" podID="f561537a-9e18-406d-910f-d75c86c27c5a" containerID="8d4831f1db26c38be558df157489b5be80452d837ef3e57c8a07f6c050231642" exitCode=0
Oct 07 14:50:41 crc kubenswrapper[4959]: I1007 14:50:41.554050 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nb6tc" event={"ID":"f561537a-9e18-406d-910f-d75c86c27c5a","Type":"ContainerDied","Data":"8d4831f1db26c38be558df157489b5be80452d837ef3e57c8a07f6c050231642"}
Oct 07 14:50:43 crc kubenswrapper[4959]: I1007 14:50:43.576798 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nb6tc" event={"ID":"f561537a-9e18-406d-910f-d75c86c27c5a","Type":"ContainerStarted","Data":"8477006e4f9aa76ca6774d373a50cd9dd324a9b7d7aa6d2d896e0f520f2faf96"}
Oct 07 14:50:43 crc kubenswrapper[4959]: I1007 14:50:43.600212 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nb6tc" podStartSLOduration=2.7736788260000003 podStartE2EDuration="5.600191551s" podCreationTimestamp="2025-10-07 14:50:38 +0000 UTC" firstStartedPulling="2025-10-07 14:50:39.535045998 +0000 UTC m=+6591.695768675" lastFinishedPulling="2025-10-07 14:50:42.361558723 +0000 UTC m=+6594.522281400" observedRunningTime="2025-10-07 14:50:43.597467983 +0000 UTC m=+6595.758190650" watchObservedRunningTime="2025-10-07 14:50:43.600191551 +0000 UTC m=+6595.760914228"
Oct 07 14:50:48 crc kubenswrapper[4959]: I1007 14:50:48.709269 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:48 crc kubenswrapper[4959]: I1007 14:50:48.710022 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:48 crc kubenswrapper[4959]: I1007 14:50:48.758635 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:49 crc kubenswrapper[4959]: I1007 14:50:49.686112 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:49 crc kubenswrapper[4959]: I1007 14:50:49.739268 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nb6tc"]
Oct 07 14:50:51 crc kubenswrapper[4959]: I1007 14:50:51.645021 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nb6tc" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="registry-server" containerID="cri-o://8477006e4f9aa76ca6774d373a50cd9dd324a9b7d7aa6d2d896e0f520f2faf96" gracePeriod=2
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.657349 4959 generic.go:334] "Generic (PLEG): container finished" podID="f561537a-9e18-406d-910f-d75c86c27c5a" containerID="8477006e4f9aa76ca6774d373a50cd9dd324a9b7d7aa6d2d896e0f520f2faf96" exitCode=0
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.657437 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nb6tc" event={"ID":"f561537a-9e18-406d-910f-d75c86c27c5a","Type":"ContainerDied","Data":"8477006e4f9aa76ca6774d373a50cd9dd324a9b7d7aa6d2d896e0f520f2faf96"}
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.770546 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.891188 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-catalog-content\") pod \"f561537a-9e18-406d-910f-d75c86c27c5a\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") "
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.891647 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-utilities\") pod \"f561537a-9e18-406d-910f-d75c86c27c5a\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") "
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.891883 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wt6g\" (UniqueName: \"kubernetes.io/projected/f561537a-9e18-406d-910f-d75c86c27c5a-kube-api-access-9wt6g\") pod \"f561537a-9e18-406d-910f-d75c86c27c5a\" (UID: \"f561537a-9e18-406d-910f-d75c86c27c5a\") "
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.893058 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-utilities" (OuterVolumeSpecName: "utilities") pod "f561537a-9e18-406d-910f-d75c86c27c5a" (UID: "f561537a-9e18-406d-910f-d75c86c27c5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.893554 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.898690 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f561537a-9e18-406d-910f-d75c86c27c5a-kube-api-access-9wt6g" (OuterVolumeSpecName: "kube-api-access-9wt6g") pod "f561537a-9e18-406d-910f-d75c86c27c5a" (UID: "f561537a-9e18-406d-910f-d75c86c27c5a"). InnerVolumeSpecName "kube-api-access-9wt6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.940068 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f561537a-9e18-406d-910f-d75c86c27c5a" (UID: "f561537a-9e18-406d-910f-d75c86c27c5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.996007 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wt6g\" (UniqueName: \"kubernetes.io/projected/f561537a-9e18-406d-910f-d75c86c27c5a-kube-api-access-9wt6g\") on node \"crc\" DevicePath \"\""
Oct 07 14:50:52 crc kubenswrapper[4959]: I1007 14:50:52.996051 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f561537a-9e18-406d-910f-d75c86c27c5a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.672438 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nb6tc" event={"ID":"f561537a-9e18-406d-910f-d75c86c27c5a","Type":"ContainerDied","Data":"1fac2a06cc2515fd2438df44437e6b06091763995913096f87276e912a85b719"}
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.672495 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nb6tc"
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.672540 4959 scope.go:117] "RemoveContainer" containerID="8477006e4f9aa76ca6774d373a50cd9dd324a9b7d7aa6d2d896e0f520f2faf96"
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.724516 4959 scope.go:117] "RemoveContainer" containerID="8d4831f1db26c38be558df157489b5be80452d837ef3e57c8a07f6c050231642"
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.732953 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nb6tc"]
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.745134 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nb6tc"]
Oct 07 14:50:53 crc kubenswrapper[4959]: I1007 14:50:53.751365 4959 scope.go:117] "RemoveContainer" containerID="035289f236d6d3021f2850bd0d39e0186ea3d5d7f03c9e6303c8466d53c82995"
Oct 07 14:50:54 crc kubenswrapper[4959]: I1007 14:50:54.818316 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" path="/var/lib/kubelet/pods/f561537a-9e18-406d-910f-d75c86c27c5a/volumes"
Oct 07 14:51:07 crc kubenswrapper[4959]: I1007 14:51:07.695790 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 14:51:07 crc kubenswrapper[4959]: I1007 14:51:07.696429 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 14:51:07 crc kubenswrapper[4959]: I1007 14:51:07.696484 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 14:51:07 crc kubenswrapper[4959]: I1007 14:51:07.697472 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 14:51:07 crc kubenswrapper[4959]: I1007 14:51:07.697537 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5" gracePeriod=600
Oct 07 14:51:07 crc kubenswrapper[4959]: E1007 14:51:07.872146 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:51:08 crc kubenswrapper[4959]: I1007 14:51:08.823661 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5" exitCode=0
Oct 07 14:51:08 crc kubenswrapper[4959]: I1007 14:51:08.827424 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"}
Oct 07 14:51:08 crc kubenswrapper[4959]: I1007 14:51:08.827544 4959 scope.go:117] "RemoveContainer" containerID="9c05042d48720b75f72fc0cc073b4bc092363c73b2ed69ca357313a93942600a"
Oct 07 14:51:08 crc kubenswrapper[4959]: I1007 14:51:08.828434 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:51:08 crc kubenswrapper[4959]: E1007 14:51:08.828915 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:51:19 crc kubenswrapper[4959]: I1007 14:51:19.809291 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:51:19 crc kubenswrapper[4959]: E1007 14:51:19.810646 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:51:31 crc kubenswrapper[4959]: I1007 14:51:31.809339 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:51:31 crc kubenswrapper[4959]: E1007 14:51:31.810609 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:51:42 crc kubenswrapper[4959]: I1007 14:51:42.809124 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:51:42 crc kubenswrapper[4959]: E1007 14:51:42.809808 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:51:54 crc kubenswrapper[4959]: I1007 14:51:54.810042 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:51:54 crc kubenswrapper[4959]: E1007 14:51:54.811424 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:52:08 crc kubenswrapper[4959]: I1007 14:52:08.817809 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:52:08 crc kubenswrapper[4959]: E1007 14:52:08.819241 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:52:21 crc kubenswrapper[4959]: I1007 14:52:21.809667 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:52:21 crc kubenswrapper[4959]: E1007 14:52:21.811710 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:52:33 crc kubenswrapper[4959]: I1007 14:52:33.808931 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:52:33 crc kubenswrapper[4959]: E1007 14:52:33.810176 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:52:48 crc kubenswrapper[4959]: I1007 14:52:48.818788 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:52:48 crc kubenswrapper[4959]: E1007 14:52:48.820846 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:53:02 crc kubenswrapper[4959]: I1007 14:53:02.808987 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:53:02 crc kubenswrapper[4959]: E1007 14:53:02.810128 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:53:16 crc kubenswrapper[4959]: I1007 14:53:16.809970 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:53:16 crc kubenswrapper[4959]: E1007 14:53:16.810881 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:53:27 crc kubenswrapper[4959]: I1007 14:53:27.809427 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:53:27 crc kubenswrapper[4959]: E1007 14:53:27.810728 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:53:38 crc kubenswrapper[4959]: I1007 14:53:38.819687 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:53:38 crc kubenswrapper[4959]: E1007 14:53:38.821042 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:53:49 crc kubenswrapper[4959]: I1007 14:53:49.809713 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:53:49 crc kubenswrapper[4959]: E1007 14:53:49.811018 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:54:00 crc kubenswrapper[4959]: I1007 14:54:00.808632 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:54:00 crc kubenswrapper[4959]: E1007 14:54:00.809595 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:54:14 crc kubenswrapper[4959]: I1007 14:54:14.810161 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:54:14 crc kubenswrapper[4959]: E1007 14:54:14.811310 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:54:28 crc kubenswrapper[4959]: I1007 14:54:28.821174 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:54:28 crc kubenswrapper[4959]: E1007 14:54:28.822473 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:54:42 crc kubenswrapper[4959]: I1007 14:54:42.809362 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:54:42 crc kubenswrapper[4959]: E1007 14:54:42.810538 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:54:54 crc kubenswrapper[4959]: I1007 14:54:54.809537 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:54:54 crc kubenswrapper[4959]: E1007 14:54:54.810744 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:55:08 crc kubenswrapper[4959]: I1007 14:55:08.815657 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:55:08 crc kubenswrapper[4959]: E1007 14:55:08.816552 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:55:19 crc kubenswrapper[4959]: I1007 14:55:19.809585 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:55:19 crc kubenswrapper[4959]: E1007 14:55:19.811055 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:55:30 crc kubenswrapper[4959]: I1007 14:55:30.809179 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:55:30 crc kubenswrapper[4959]: E1007 14:55:30.810115 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:55:44 crc kubenswrapper[4959]: I1007 14:55:44.809906 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:55:44 crc kubenswrapper[4959]: E1007 14:55:44.810896 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:55:57 crc kubenswrapper[4959]: I1007 14:55:57.809924 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:55:57 crc kubenswrapper[4959]: E1007 14:55:57.811106 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 14:56:11 crc kubenswrapper[4959]: I1007 14:56:11.808908 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5"
Oct 07 14:56:13 crc kubenswrapper[4959]: I1007 14:56:13.047064 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"11c7d93d66e90b7dccf9beeccc3894ee7bb0122d1c52abc03b21157b67415b06"}
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.200583 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8gfkz"]
Oct 07 14:56:42 crc kubenswrapper[4959]: E1007 14:56:42.201803 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="extract-content"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.201817 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="extract-content"
Oct 07 14:56:42 crc kubenswrapper[4959]: E1007 14:56:42.201841 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="registry-server"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.201847 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="registry-server"
Oct 07 14:56:42 crc kubenswrapper[4959]: E1007 14:56:42.201865 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="extract-utilities"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.201872 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="extract-utilities"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.202131 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f561537a-9e18-406d-910f-d75c86c27c5a" containerName="registry-server"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.203889 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.215509 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gfkz"]
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.333061 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6dvq\" (UniqueName: \"kubernetes.io/projected/e524efb0-1c5e-4f13-b092-7bb31c1f5311-kube-api-access-k6dvq\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.333800 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-utilities\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.333979 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-catalog-content\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.435814 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-utilities\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.435938 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-catalog-content\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.436012 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6dvq\" (UniqueName: \"kubernetes.io/projected/e524efb0-1c5e-4f13-b092-7bb31c1f5311-kube-api-access-k6dvq\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.436462 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-utilities\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.436649 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-catalog-content\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.466467 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6dvq\" (UniqueName: \"kubernetes.io/projected/e524efb0-1c5e-4f13-b092-7bb31c1f5311-kube-api-access-k6dvq\") pod \"community-operators-8gfkz\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:42 crc kubenswrapper[4959]: I1007 14:56:42.575673 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gfkz"
Oct 07 14:56:43 crc kubenswrapper[4959]: I1007 14:56:43.168201 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gfkz"]
Oct 07 14:56:43 crc kubenswrapper[4959]: I1007 14:56:43.351411 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gfkz" event={"ID":"e524efb0-1c5e-4f13-b092-7bb31c1f5311","Type":"ContainerStarted","Data":"b3c98094d939c4c621e60f4280d68155f9b06ae4db40fc12d3255917635bd446"}
Oct 07 14:56:44 crc kubenswrapper[4959]: I1007 14:56:44.365350 4959 generic.go:334] "Generic (PLEG): container finished" podID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerID="fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad" exitCode=0
Oct 07 14:56:44 crc kubenswrapper[4959]: I1007 14:56:44.365421 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gfkz" event={"ID":"e524efb0-1c5e-4f13-b092-7bb31c1f5311","Type":"ContainerDied","Data":"fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad"}
Oct 07 14:56:44 crc 
kubenswrapper[4959]: I1007 14:56:44.368547 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:56:47 crc kubenswrapper[4959]: I1007 14:56:47.399762 4959 generic.go:334] "Generic (PLEG): container finished" podID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerID="7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635" exitCode=0 Oct 07 14:56:47 crc kubenswrapper[4959]: I1007 14:56:47.399880 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gfkz" event={"ID":"e524efb0-1c5e-4f13-b092-7bb31c1f5311","Type":"ContainerDied","Data":"7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635"} Oct 07 14:56:48 crc kubenswrapper[4959]: I1007 14:56:48.417472 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gfkz" event={"ID":"e524efb0-1c5e-4f13-b092-7bb31c1f5311","Type":"ContainerStarted","Data":"e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee"} Oct 07 14:56:52 crc kubenswrapper[4959]: I1007 14:56:52.576265 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8gfkz" Oct 07 14:56:52 crc kubenswrapper[4959]: I1007 14:56:52.577146 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8gfkz" Oct 07 14:56:52 crc kubenswrapper[4959]: I1007 14:56:52.627972 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8gfkz" Oct 07 14:56:52 crc kubenswrapper[4959]: I1007 14:56:52.649984 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8gfkz" podStartSLOduration=7.115923566 podStartE2EDuration="10.64995761s" podCreationTimestamp="2025-10-07 14:56:42 +0000 UTC" firstStartedPulling="2025-10-07 14:56:44.368269502 +0000 UTC 
m=+6956.528992179" lastFinishedPulling="2025-10-07 14:56:47.902303546 +0000 UTC m=+6960.063026223" observedRunningTime="2025-10-07 14:56:48.439378515 +0000 UTC m=+6960.600101202" watchObservedRunningTime="2025-10-07 14:56:52.64995761 +0000 UTC m=+6964.810680297" Oct 07 14:56:53 crc kubenswrapper[4959]: I1007 14:56:53.529767 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8gfkz" Oct 07 14:56:53 crc kubenswrapper[4959]: I1007 14:56:53.588331 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gfkz"] Oct 07 14:56:55 crc kubenswrapper[4959]: I1007 14:56:55.499979 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8gfkz" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="registry-server" containerID="cri-o://e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee" gracePeriod=2 Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.004846 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8gfkz" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.081723 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-utilities\") pod \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.081898 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-catalog-content\") pod \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.082140 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6dvq\" (UniqueName: \"kubernetes.io/projected/e524efb0-1c5e-4f13-b092-7bb31c1f5311-kube-api-access-k6dvq\") pod \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\" (UID: \"e524efb0-1c5e-4f13-b092-7bb31c1f5311\") " Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.082980 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-utilities" (OuterVolumeSpecName: "utilities") pod "e524efb0-1c5e-4f13-b092-7bb31c1f5311" (UID: "e524efb0-1c5e-4f13-b092-7bb31c1f5311"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.091093 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e524efb0-1c5e-4f13-b092-7bb31c1f5311-kube-api-access-k6dvq" (OuterVolumeSpecName: "kube-api-access-k6dvq") pod "e524efb0-1c5e-4f13-b092-7bb31c1f5311" (UID: "e524efb0-1c5e-4f13-b092-7bb31c1f5311"). InnerVolumeSpecName "kube-api-access-k6dvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.142585 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e524efb0-1c5e-4f13-b092-7bb31c1f5311" (UID: "e524efb0-1c5e-4f13-b092-7bb31c1f5311"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.185569 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.185618 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e524efb0-1c5e-4f13-b092-7bb31c1f5311-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.185657 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6dvq\" (UniqueName: \"kubernetes.io/projected/e524efb0-1c5e-4f13-b092-7bb31c1f5311-kube-api-access-k6dvq\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.513190 4959 generic.go:334] "Generic (PLEG): container finished" podID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerID="e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee" exitCode=0 Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.513301 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gfkz" event={"ID":"e524efb0-1c5e-4f13-b092-7bb31c1f5311","Type":"ContainerDied","Data":"e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee"} Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.513732 4959 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-8gfkz" event={"ID":"e524efb0-1c5e-4f13-b092-7bb31c1f5311","Type":"ContainerDied","Data":"b3c98094d939c4c621e60f4280d68155f9b06ae4db40fc12d3255917635bd446"} Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.513767 4959 scope.go:117] "RemoveContainer" containerID="e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.513334 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gfkz" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.535524 4959 scope.go:117] "RemoveContainer" containerID="7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.564721 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gfkz"] Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.567805 4959 scope.go:117] "RemoveContainer" containerID="fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.572177 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8gfkz"] Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.618406 4959 scope.go:117] "RemoveContainer" containerID="e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee" Oct 07 14:56:56 crc kubenswrapper[4959]: E1007 14:56:56.619227 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee\": container with ID starting with e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee not found: ID does not exist" containerID="e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 
14:56:56.619285 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee"} err="failed to get container status \"e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee\": rpc error: code = NotFound desc = could not find container \"e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee\": container with ID starting with e187d9d13fc45c07ac8759f39386740e2804245c8b59f5cc709f616c2f60bbee not found: ID does not exist" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.619328 4959 scope.go:117] "RemoveContainer" containerID="7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635" Oct 07 14:56:56 crc kubenswrapper[4959]: E1007 14:56:56.619719 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635\": container with ID starting with 7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635 not found: ID does not exist" containerID="7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.619776 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635"} err="failed to get container status \"7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635\": rpc error: code = NotFound desc = could not find container \"7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635\": container with ID starting with 7932bb7a0aeb3c2eddff90978de1b5c278bb7e0dc4fb7de9d1d9b930e3b9e635 not found: ID does not exist" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.619812 4959 scope.go:117] "RemoveContainer" containerID="fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad" Oct 07 14:56:56 crc 
kubenswrapper[4959]: E1007 14:56:56.620335 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad\": container with ID starting with fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad not found: ID does not exist" containerID="fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.620374 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad"} err="failed to get container status \"fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad\": rpc error: code = NotFound desc = could not find container \"fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad\": container with ID starting with fe333711dd01e4bfa673fbc9a48bf9776c011d4c5c3b2a9ad3c7b7e66ab0ffad not found: ID does not exist" Oct 07 14:56:56 crc kubenswrapper[4959]: I1007 14:56:56.822651 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" path="/var/lib/kubelet/pods/e524efb0-1c5e-4f13-b092-7bb31c1f5311/volumes" Oct 07 14:58:37 crc kubenswrapper[4959]: I1007 14:58:37.695472 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:58:37 crc kubenswrapper[4959]: I1007 14:58:37.696203 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 07 14:59:07 crc kubenswrapper[4959]: I1007 14:59:07.696152 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:59:07 crc kubenswrapper[4959]: I1007 14:59:07.696897 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:59:37 crc kubenswrapper[4959]: I1007 14:59:37.695692 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:59:37 crc kubenswrapper[4959]: I1007 14:59:37.696266 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:59:37 crc kubenswrapper[4959]: I1007 14:59:37.696313 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 14:59:37 crc kubenswrapper[4959]: I1007 14:59:37.697091 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"11c7d93d66e90b7dccf9beeccc3894ee7bb0122d1c52abc03b21157b67415b06"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:59:37 crc kubenswrapper[4959]: I1007 14:59:37.697149 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://11c7d93d66e90b7dccf9beeccc3894ee7bb0122d1c52abc03b21157b67415b06" gracePeriod=600 Oct 07 14:59:38 crc kubenswrapper[4959]: I1007 14:59:38.140248 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="11c7d93d66e90b7dccf9beeccc3894ee7bb0122d1c52abc03b21157b67415b06" exitCode=0 Oct 07 14:59:38 crc kubenswrapper[4959]: I1007 14:59:38.140347 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"11c7d93d66e90b7dccf9beeccc3894ee7bb0122d1c52abc03b21157b67415b06"} Oct 07 14:59:38 crc kubenswrapper[4959]: I1007 14:59:38.140604 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"} Oct 07 14:59:38 crc kubenswrapper[4959]: I1007 14:59:38.140639 4959 scope.go:117] "RemoveContainer" containerID="c404ea14315566ddcbc4d3eda640461a688db74629e6ab3b52d00db778d5b0f5" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.168168 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr"] Oct 07 15:00:00 crc kubenswrapper[4959]: 
E1007 15:00:00.169456 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="extract-utilities" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.169472 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="extract-utilities" Oct 07 15:00:00 crc kubenswrapper[4959]: E1007 15:00:00.169481 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.169487 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4959]: E1007 15:00:00.169517 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="extract-content" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.169523 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="extract-content" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.169838 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="e524efb0-1c5e-4f13-b092-7bb31c1f5311" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.170608 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.180174 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr"] Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.186086 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.186443 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.269774 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a255ac3a-d0de-4460-982e-492b2b4a9bd7-secret-volume\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.269864 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb8cw\" (UniqueName: \"kubernetes.io/projected/a255ac3a-d0de-4460-982e-492b2b4a9bd7-kube-api-access-tb8cw\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.269962 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a255ac3a-d0de-4460-982e-492b2b4a9bd7-config-volume\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.372424 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a255ac3a-d0de-4460-982e-492b2b4a9bd7-secret-volume\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.372561 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb8cw\" (UniqueName: \"kubernetes.io/projected/a255ac3a-d0de-4460-982e-492b2b4a9bd7-kube-api-access-tb8cw\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.372638 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a255ac3a-d0de-4460-982e-492b2b4a9bd7-config-volume\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.373959 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a255ac3a-d0de-4460-982e-492b2b4a9bd7-config-volume\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.380749 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a255ac3a-d0de-4460-982e-492b2b4a9bd7-secret-volume\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.392982 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb8cw\" (UniqueName: \"kubernetes.io/projected/a255ac3a-d0de-4460-982e-492b2b4a9bd7-kube-api-access-tb8cw\") pod \"collect-profiles-29330820-b8nnr\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.501595 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:00 crc kubenswrapper[4959]: I1007 15:00:00.988798 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr"] Oct 07 15:00:01 crc kubenswrapper[4959]: I1007 15:00:01.367026 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" event={"ID":"a255ac3a-d0de-4460-982e-492b2b4a9bd7","Type":"ContainerStarted","Data":"249d9428de1755fb1cc95dfaf34296672bd6a069c00850980f55334839e414c1"} Oct 07 15:00:01 crc kubenswrapper[4959]: I1007 15:00:01.367468 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" event={"ID":"a255ac3a-d0de-4460-982e-492b2b4a9bd7","Type":"ContainerStarted","Data":"350e216e9979da1ed96c0e0a6726872287e92e939c7016b422900f4734e55789"} Oct 07 15:00:01 crc kubenswrapper[4959]: I1007 15:00:01.382229 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" 
podStartSLOduration=1.382209842 podStartE2EDuration="1.382209842s" podCreationTimestamp="2025-10-07 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:00:01.381863872 +0000 UTC m=+7153.542586549" watchObservedRunningTime="2025-10-07 15:00:01.382209842 +0000 UTC m=+7153.542932519" Oct 07 15:00:02 crc kubenswrapper[4959]: I1007 15:00:02.378607 4959 generic.go:334] "Generic (PLEG): container finished" podID="a255ac3a-d0de-4460-982e-492b2b4a9bd7" containerID="249d9428de1755fb1cc95dfaf34296672bd6a069c00850980f55334839e414c1" exitCode=0 Oct 07 15:00:02 crc kubenswrapper[4959]: I1007 15:00:02.378681 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" event={"ID":"a255ac3a-d0de-4460-982e-492b2b4a9bd7","Type":"ContainerDied","Data":"249d9428de1755fb1cc95dfaf34296672bd6a069c00850980f55334839e414c1"} Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.790755 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.846710 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a255ac3a-d0de-4460-982e-492b2b4a9bd7-secret-volume\") pod \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.846856 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a255ac3a-d0de-4460-982e-492b2b4a9bd7-config-volume\") pod \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.846989 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb8cw\" (UniqueName: \"kubernetes.io/projected/a255ac3a-d0de-4460-982e-492b2b4a9bd7-kube-api-access-tb8cw\") pod \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\" (UID: \"a255ac3a-d0de-4460-982e-492b2b4a9bd7\") " Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.847920 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a255ac3a-d0de-4460-982e-492b2b4a9bd7-config-volume" (OuterVolumeSpecName: "config-volume") pod "a255ac3a-d0de-4460-982e-492b2b4a9bd7" (UID: "a255ac3a-d0de-4460-982e-492b2b4a9bd7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.855207 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a255ac3a-d0de-4460-982e-492b2b4a9bd7-kube-api-access-tb8cw" (OuterVolumeSpecName: "kube-api-access-tb8cw") pod "a255ac3a-d0de-4460-982e-492b2b4a9bd7" (UID: "a255ac3a-d0de-4460-982e-492b2b4a9bd7"). 
InnerVolumeSpecName "kube-api-access-tb8cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.855330 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a255ac3a-d0de-4460-982e-492b2b4a9bd7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a255ac3a-d0de-4460-982e-492b2b4a9bd7" (UID: "a255ac3a-d0de-4460-982e-492b2b4a9bd7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.949407 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a255ac3a-d0de-4460-982e-492b2b4a9bd7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.949452 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb8cw\" (UniqueName: \"kubernetes.io/projected/a255ac3a-d0de-4460-982e-492b2b4a9bd7-kube-api-access-tb8cw\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:03 crc kubenswrapper[4959]: I1007 15:00:03.949468 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a255ac3a-d0de-4460-982e-492b2b4a9bd7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:04 crc kubenswrapper[4959]: I1007 15:00:04.400986 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" event={"ID":"a255ac3a-d0de-4460-982e-492b2b4a9bd7","Type":"ContainerDied","Data":"350e216e9979da1ed96c0e0a6726872287e92e939c7016b422900f4734e55789"} Oct 07 15:00:04 crc kubenswrapper[4959]: I1007 15:00:04.401041 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-b8nnr" Oct 07 15:00:04 crc kubenswrapper[4959]: I1007 15:00:04.401059 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350e216e9979da1ed96c0e0a6726872287e92e939c7016b422900f4734e55789" Oct 07 15:00:04 crc kubenswrapper[4959]: I1007 15:00:04.451150 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"] Oct 07 15:00:04 crc kubenswrapper[4959]: I1007 15:00:04.459005 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-2f9jw"] Oct 07 15:00:04 crc kubenswrapper[4959]: I1007 15:00:04.822791 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1178c1d0-6adf-4cda-9dc7-927ca47f1659" path="/var/lib/kubelet/pods/1178c1d0-6adf-4cda-9dc7-927ca47f1659/volumes" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.041318 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trlw7"] Oct 07 15:00:26 crc kubenswrapper[4959]: E1007 15:00:26.042371 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a255ac3a-d0de-4460-982e-492b2b4a9bd7" containerName="collect-profiles" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.042385 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a255ac3a-d0de-4460-982e-492b2b4a9bd7" containerName="collect-profiles" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.042593 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a255ac3a-d0de-4460-982e-492b2b4a9bd7" containerName="collect-profiles" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.046087 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.062004 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trlw7"] Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.157725 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-catalog-content\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.157836 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-utilities\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.158090 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9pz\" (UniqueName: \"kubernetes.io/projected/a3c976b3-8567-4c3a-a486-60c24a24db11-kube-api-access-rd9pz\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.261784 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-catalog-content\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.261870 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-utilities\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.261954 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9pz\" (UniqueName: \"kubernetes.io/projected/a3c976b3-8567-4c3a-a486-60c24a24db11-kube-api-access-rd9pz\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.262400 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-utilities\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.263141 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-catalog-content\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.284903 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9pz\" (UniqueName: \"kubernetes.io/projected/a3c976b3-8567-4c3a-a486-60c24a24db11-kube-api-access-rd9pz\") pod \"redhat-marketplace-trlw7\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.369513 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:26 crc kubenswrapper[4959]: I1007 15:00:26.848703 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trlw7"] Oct 07 15:00:27 crc kubenswrapper[4959]: I1007 15:00:27.613920 4959 generic.go:334] "Generic (PLEG): container finished" podID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerID="6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a" exitCode=0 Oct 07 15:00:27 crc kubenswrapper[4959]: I1007 15:00:27.614283 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trlw7" event={"ID":"a3c976b3-8567-4c3a-a486-60c24a24db11","Type":"ContainerDied","Data":"6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a"} Oct 07 15:00:27 crc kubenswrapper[4959]: I1007 15:00:27.614318 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trlw7" event={"ID":"a3c976b3-8567-4c3a-a486-60c24a24db11","Type":"ContainerStarted","Data":"21bec9b7025798ae0b1a43124033dd7966af511ab8987da663b2984d254dad8d"} Oct 07 15:00:29 crc kubenswrapper[4959]: I1007 15:00:29.650069 4959 generic.go:334] "Generic (PLEG): container finished" podID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerID="7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238" exitCode=0 Oct 07 15:00:29 crc kubenswrapper[4959]: I1007 15:00:29.650334 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trlw7" event={"ID":"a3c976b3-8567-4c3a-a486-60c24a24db11","Type":"ContainerDied","Data":"7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238"} Oct 07 15:00:30 crc kubenswrapper[4959]: I1007 15:00:30.669097 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trlw7" 
event={"ID":"a3c976b3-8567-4c3a-a486-60c24a24db11","Type":"ContainerStarted","Data":"af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18"} Oct 07 15:00:30 crc kubenswrapper[4959]: I1007 15:00:30.700187 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trlw7" podStartSLOduration=2.248645539 podStartE2EDuration="4.700150064s" podCreationTimestamp="2025-10-07 15:00:26 +0000 UTC" firstStartedPulling="2025-10-07 15:00:27.61644666 +0000 UTC m=+7179.777169347" lastFinishedPulling="2025-10-07 15:00:30.067951195 +0000 UTC m=+7182.228673872" observedRunningTime="2025-10-07 15:00:30.69271563 +0000 UTC m=+7182.853438347" watchObservedRunningTime="2025-10-07 15:00:30.700150064 +0000 UTC m=+7182.860872741" Oct 07 15:00:36 crc kubenswrapper[4959]: I1007 15:00:36.370653 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:36 crc kubenswrapper[4959]: I1007 15:00:36.371511 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:36 crc kubenswrapper[4959]: I1007 15:00:36.438446 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:36 crc kubenswrapper[4959]: I1007 15:00:36.823971 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:36 crc kubenswrapper[4959]: I1007 15:00:36.885373 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trlw7"] Oct 07 15:00:38 crc kubenswrapper[4959]: I1007 15:00:38.781253 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-trlw7" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="registry-server" 
containerID="cri-o://af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18" gracePeriod=2 Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.301570 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.389041 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-catalog-content\") pod \"a3c976b3-8567-4c3a-a486-60c24a24db11\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.389452 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd9pz\" (UniqueName: \"kubernetes.io/projected/a3c976b3-8567-4c3a-a486-60c24a24db11-kube-api-access-rd9pz\") pod \"a3c976b3-8567-4c3a-a486-60c24a24db11\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.389646 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-utilities\") pod \"a3c976b3-8567-4c3a-a486-60c24a24db11\" (UID: \"a3c976b3-8567-4c3a-a486-60c24a24db11\") " Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.390616 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-utilities" (OuterVolumeSpecName: "utilities") pod "a3c976b3-8567-4c3a-a486-60c24a24db11" (UID: "a3c976b3-8567-4c3a-a486-60c24a24db11"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.398787 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c976b3-8567-4c3a-a486-60c24a24db11-kube-api-access-rd9pz" (OuterVolumeSpecName: "kube-api-access-rd9pz") pod "a3c976b3-8567-4c3a-a486-60c24a24db11" (UID: "a3c976b3-8567-4c3a-a486-60c24a24db11"). InnerVolumeSpecName "kube-api-access-rd9pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.403801 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3c976b3-8567-4c3a-a486-60c24a24db11" (UID: "a3c976b3-8567-4c3a-a486-60c24a24db11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.492348 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd9pz\" (UniqueName: \"kubernetes.io/projected/a3c976b3-8567-4c3a-a486-60c24a24db11-kube-api-access-rd9pz\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.492593 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.492685 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c976b3-8567-4c3a-a486-60c24a24db11-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.797113 4959 generic.go:334] "Generic (PLEG): container finished" podID="a3c976b3-8567-4c3a-a486-60c24a24db11" 
containerID="af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18" exitCode=0 Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.797171 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trlw7" event={"ID":"a3c976b3-8567-4c3a-a486-60c24a24db11","Type":"ContainerDied","Data":"af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18"} Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.797208 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trlw7" event={"ID":"a3c976b3-8567-4c3a-a486-60c24a24db11","Type":"ContainerDied","Data":"21bec9b7025798ae0b1a43124033dd7966af511ab8987da663b2984d254dad8d"} Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.797233 4959 scope.go:117] "RemoveContainer" containerID="af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.797292 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trlw7" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.851133 4959 scope.go:117] "RemoveContainer" containerID="7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.860582 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trlw7"] Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.872083 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-trlw7"] Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.889812 4959 scope.go:117] "RemoveContainer" containerID="6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.934411 4959 scope.go:117] "RemoveContainer" containerID="af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18" Oct 07 15:00:39 crc kubenswrapper[4959]: E1007 15:00:39.935053 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18\": container with ID starting with af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18 not found: ID does not exist" containerID="af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.935091 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18"} err="failed to get container status \"af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18\": rpc error: code = NotFound desc = could not find container \"af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18\": container with ID starting with af91aedcb2676ce13d2ae179e8e231fd1dfba7fc8d9408db00ddd090d78ccf18 not found: 
ID does not exist" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.935119 4959 scope.go:117] "RemoveContainer" containerID="7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238" Oct 07 15:00:39 crc kubenswrapper[4959]: E1007 15:00:39.935680 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238\": container with ID starting with 7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238 not found: ID does not exist" containerID="7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.935709 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238"} err="failed to get container status \"7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238\": rpc error: code = NotFound desc = could not find container \"7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238\": container with ID starting with 7c34007f6d8ee77e211a763118a1b02f1fabb5001d44a74146b8c653b5714238 not found: ID does not exist" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.935727 4959 scope.go:117] "RemoveContainer" containerID="6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a" Oct 07 15:00:39 crc kubenswrapper[4959]: E1007 15:00:39.936036 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a\": container with ID starting with 6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a not found: ID does not exist" containerID="6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a" Oct 07 15:00:39 crc kubenswrapper[4959]: I1007 15:00:39.936057 4959 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a"} err="failed to get container status \"6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a\": rpc error: code = NotFound desc = could not find container \"6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a\": container with ID starting with 6bbd1f233c81b063d5b44515db05230ae16ef38ec5b05d9c2d0cb502cd7e8e3a not found: ID does not exist" Oct 07 15:00:40 crc kubenswrapper[4959]: I1007 15:00:40.821091 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" path="/var/lib/kubelet/pods/a3c976b3-8567-4c3a-a486-60c24a24db11/volumes" Oct 07 15:00:56 crc kubenswrapper[4959]: I1007 15:00:56.653870 4959 scope.go:117] "RemoveContainer" containerID="63d4165986cf720d219fca4adc1de02bec19865d1c157e16050aa812ecc6d795" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.387870 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l9zm6"] Oct 07 15:00:58 crc kubenswrapper[4959]: E1007 15:00:58.389254 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="extract-content" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.389274 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="extract-content" Oct 07 15:00:58 crc kubenswrapper[4959]: E1007 15:00:58.389307 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="registry-server" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.389315 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="registry-server" Oct 07 15:00:58 crc kubenswrapper[4959]: E1007 15:00:58.389360 4959 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="extract-utilities" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.389371 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="extract-utilities" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.389661 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c976b3-8567-4c3a-a486-60c24a24db11" containerName="registry-server" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.393039 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.398262 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9zm6"] Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.583874 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfbf5\" (UniqueName: \"kubernetes.io/projected/8fb46878-6028-493f-89c5-923f2297f48d-kube-api-access-cfbf5\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.584039 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-catalog-content\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.586019 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-utilities\") pod \"redhat-operators-l9zm6\" (UID: 
\"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.688951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfbf5\" (UniqueName: \"kubernetes.io/projected/8fb46878-6028-493f-89c5-923f2297f48d-kube-api-access-cfbf5\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.689121 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-catalog-content\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.689316 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-utilities\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.690102 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-utilities\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.690109 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-catalog-content\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " 
pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.710370 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfbf5\" (UniqueName: \"kubernetes.io/projected/8fb46878-6028-493f-89c5-923f2297f48d-kube-api-access-cfbf5\") pod \"redhat-operators-l9zm6\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") " pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:58 crc kubenswrapper[4959]: I1007 15:00:58.718413 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9zm6" Oct 07 15:00:59 crc kubenswrapper[4959]: I1007 15:00:59.224325 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9zm6"] Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.013302 4959 generic.go:334] "Generic (PLEG): container finished" podID="8fb46878-6028-493f-89c5-923f2297f48d" containerID="e1ba70494f9b69ecd188667cb47b738ee6d98c45ee674821a97240f26eeb689b" exitCode=0 Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.013387 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerDied","Data":"e1ba70494f9b69ecd188667cb47b738ee6d98c45ee674821a97240f26eeb689b"} Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.013787 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerStarted","Data":"f8fb37548635839366c0d11d6f0052851e9e5f600ebcd8b6825312319d312ef2"} Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.149918 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29330821-7g47d"] Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.151519 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.163355 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330821-7g47d"] Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.329680 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-config-data\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.330114 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-fernet-keys\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.330180 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlqk\" (UniqueName: \"kubernetes.io/projected/95a09836-b1d0-4b20-8b66-13cadce981d6-kube-api-access-hxlqk\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.330378 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-combined-ca-bundle\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.432451 4959 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-combined-ca-bundle\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.432506 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-config-data\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.432598 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-fernet-keys\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.432620 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlqk\" (UniqueName: \"kubernetes.io/projected/95a09836-b1d0-4b20-8b66-13cadce981d6-kube-api-access-hxlqk\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.464749 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-fernet-keys\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d" Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.471409 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlqk\" (UniqueName: 
\"kubernetes.io/projected/95a09836-b1d0-4b20-8b66-13cadce981d6-kube-api-access-hxlqk\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d"
Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.471528 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-config-data\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d"
Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.471616 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-combined-ca-bundle\") pod \"keystone-cron-29330821-7g47d\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") " pod="openstack/keystone-cron-29330821-7g47d"
Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.472182 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-7g47d"
Oct 07 15:01:00 crc kubenswrapper[4959]: I1007 15:01:00.951222 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330821-7g47d"]
Oct 07 15:01:01 crc kubenswrapper[4959]: I1007 15:01:01.023423 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-7g47d" event={"ID":"95a09836-b1d0-4b20-8b66-13cadce981d6","Type":"ContainerStarted","Data":"99215e676946e718b48ae820823938ea0fd17036584dad30f1f85b2d241aff65"}
Oct 07 15:01:02 crc kubenswrapper[4959]: I1007 15:01:02.036283 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerStarted","Data":"9154d19d5a513ac8ebd9d2c60a6e61a1e89621d9041252cea361a185f5289e5a"}
Oct 07 15:01:02 crc kubenswrapper[4959]: I1007 15:01:02.038601 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-7g47d" event={"ID":"95a09836-b1d0-4b20-8b66-13cadce981d6","Type":"ContainerStarted","Data":"bf599200b3823ecf91fb25c4229f411b99f51db39b3e2cb983f424a9347b066b"}
Oct 07 15:01:02 crc kubenswrapper[4959]: I1007 15:01:02.075420 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29330821-7g47d" podStartSLOduration=2.07540229 podStartE2EDuration="2.07540229s" podCreationTimestamp="2025-10-07 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:01:02.067653108 +0000 UTC m=+7214.228375805" watchObservedRunningTime="2025-10-07 15:01:02.07540229 +0000 UTC m=+7214.236124967"
Oct 07 15:01:03 crc kubenswrapper[4959]: I1007 15:01:03.050016 4959 generic.go:334] "Generic (PLEG): container finished" podID="8fb46878-6028-493f-89c5-923f2297f48d" containerID="9154d19d5a513ac8ebd9d2c60a6e61a1e89621d9041252cea361a185f5289e5a" exitCode=0
Oct 07 15:01:03 crc kubenswrapper[4959]: I1007 15:01:03.050219 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerDied","Data":"9154d19d5a513ac8ebd9d2c60a6e61a1e89621d9041252cea361a185f5289e5a"}
Oct 07 15:01:06 crc kubenswrapper[4959]: I1007 15:01:06.080352 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerStarted","Data":"928aee2dfb9fecbd7184a3e1972a4c38181e42a73ead39cce6e7445032601a3a"}
Oct 07 15:01:06 crc kubenswrapper[4959]: I1007 15:01:06.099084 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l9zm6" podStartSLOduration=3.086862978 podStartE2EDuration="8.099066982s" podCreationTimestamp="2025-10-07 15:00:58 +0000 UTC" firstStartedPulling="2025-10-07 15:01:00.015289764 +0000 UTC m=+7212.176012441" lastFinishedPulling="2025-10-07 15:01:05.027493768 +0000 UTC m=+7217.188216445" observedRunningTime="2025-10-07 15:01:06.095924902 +0000 UTC m=+7218.256647589" watchObservedRunningTime="2025-10-07 15:01:06.099066982 +0000 UTC m=+7218.259789659"
Oct 07 15:01:07 crc kubenswrapper[4959]: I1007 15:01:07.090713 4959 generic.go:334] "Generic (PLEG): container finished" podID="95a09836-b1d0-4b20-8b66-13cadce981d6" containerID="bf599200b3823ecf91fb25c4229f411b99f51db39b3e2cb983f424a9347b066b" exitCode=0
Oct 07 15:01:07 crc kubenswrapper[4959]: I1007 15:01:07.090816 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-7g47d" event={"ID":"95a09836-b1d0-4b20-8b66-13cadce981d6","Type":"ContainerDied","Data":"bf599200b3823ecf91fb25c4229f411b99f51db39b3e2cb983f424a9347b066b"}
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.509748 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-7g47d"
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.521401 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-combined-ca-bundle\") pod \"95a09836-b1d0-4b20-8b66-13cadce981d6\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") "
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.521682 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-config-data\") pod \"95a09836-b1d0-4b20-8b66-13cadce981d6\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") "
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.521735 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxlqk\" (UniqueName: \"kubernetes.io/projected/95a09836-b1d0-4b20-8b66-13cadce981d6-kube-api-access-hxlqk\") pod \"95a09836-b1d0-4b20-8b66-13cadce981d6\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") "
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.521768 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-fernet-keys\") pod \"95a09836-b1d0-4b20-8b66-13cadce981d6\" (UID: \"95a09836-b1d0-4b20-8b66-13cadce981d6\") "
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.529886 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a09836-b1d0-4b20-8b66-13cadce981d6-kube-api-access-hxlqk" (OuterVolumeSpecName: "kube-api-access-hxlqk") pod "95a09836-b1d0-4b20-8b66-13cadce981d6" (UID: "95a09836-b1d0-4b20-8b66-13cadce981d6"). InnerVolumeSpecName "kube-api-access-hxlqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.534000 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "95a09836-b1d0-4b20-8b66-13cadce981d6" (UID: "95a09836-b1d0-4b20-8b66-13cadce981d6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.571109 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a09836-b1d0-4b20-8b66-13cadce981d6" (UID: "95a09836-b1d0-4b20-8b66-13cadce981d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.614361 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-config-data" (OuterVolumeSpecName: "config-data") pod "95a09836-b1d0-4b20-8b66-13cadce981d6" (UID: "95a09836-b1d0-4b20-8b66-13cadce981d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.623729 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.623755 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxlqk\" (UniqueName: \"kubernetes.io/projected/95a09836-b1d0-4b20-8b66-13cadce981d6-kube-api-access-hxlqk\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.623766 4959 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.623774 4959 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a09836-b1d0-4b20-8b66-13cadce981d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.719252 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l9zm6"
Oct 07 15:01:08 crc kubenswrapper[4959]: I1007 15:01:08.719677 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l9zm6"
Oct 07 15:01:09 crc kubenswrapper[4959]: I1007 15:01:09.115255 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-7g47d" event={"ID":"95a09836-b1d0-4b20-8b66-13cadce981d6","Type":"ContainerDied","Data":"99215e676946e718b48ae820823938ea0fd17036584dad30f1f85b2d241aff65"}
Oct 07 15:01:09 crc kubenswrapper[4959]: I1007 15:01:09.115340 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99215e676946e718b48ae820823938ea0fd17036584dad30f1f85b2d241aff65"
Oct 07 15:01:09 crc kubenswrapper[4959]: I1007 15:01:09.115414 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-7g47d"
Oct 07 15:01:09 crc kubenswrapper[4959]: I1007 15:01:09.776806 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9zm6" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="registry-server" probeResult="failure" output=<
Oct 07 15:01:09 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s
Oct 07 15:01:09 crc kubenswrapper[4959]: >
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.448587 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rx5f5"]
Oct 07 15:01:16 crc kubenswrapper[4959]: E1007 15:01:16.450930 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a09836-b1d0-4b20-8b66-13cadce981d6" containerName="keystone-cron"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.450956 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a09836-b1d0-4b20-8b66-13cadce981d6" containerName="keystone-cron"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.451200 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a09836-b1d0-4b20-8b66-13cadce981d6" containerName="keystone-cron"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.454174 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.471670 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rx5f5"]
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.514906 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr64p\" (UniqueName: \"kubernetes.io/projected/bb339141-77d5-4681-bc7a-549f37140f3c-kube-api-access-cr64p\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.514941 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb339141-77d5-4681-bc7a-549f37140f3c-catalog-content\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.515002 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb339141-77d5-4681-bc7a-549f37140f3c-utilities\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.617511 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb339141-77d5-4681-bc7a-549f37140f3c-catalog-content\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.617556 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr64p\" (UniqueName: \"kubernetes.io/projected/bb339141-77d5-4681-bc7a-549f37140f3c-kube-api-access-cr64p\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.617646 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb339141-77d5-4681-bc7a-549f37140f3c-utilities\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.618004 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb339141-77d5-4681-bc7a-549f37140f3c-catalog-content\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.618034 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb339141-77d5-4681-bc7a-549f37140f3c-utilities\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.643134 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr64p\" (UniqueName: \"kubernetes.io/projected/bb339141-77d5-4681-bc7a-549f37140f3c-kube-api-access-cr64p\") pod \"certified-operators-rx5f5\" (UID: \"bb339141-77d5-4681-bc7a-549f37140f3c\") " pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:16 crc kubenswrapper[4959]: I1007 15:01:16.783791 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:17 crc kubenswrapper[4959]: I1007 15:01:17.375549 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rx5f5"]
Oct 07 15:01:18 crc kubenswrapper[4959]: I1007 15:01:18.205925 4959 generic.go:334] "Generic (PLEG): container finished" podID="bb339141-77d5-4681-bc7a-549f37140f3c" containerID="6926b6ad98b9e77305d6bb1cab68213565c7a204b2fd0553e82b19ef482b67cf" exitCode=0
Oct 07 15:01:18 crc kubenswrapper[4959]: I1007 15:01:18.205984 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx5f5" event={"ID":"bb339141-77d5-4681-bc7a-549f37140f3c","Type":"ContainerDied","Data":"6926b6ad98b9e77305d6bb1cab68213565c7a204b2fd0553e82b19ef482b67cf"}
Oct 07 15:01:18 crc kubenswrapper[4959]: I1007 15:01:18.206317 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx5f5" event={"ID":"bb339141-77d5-4681-bc7a-549f37140f3c","Type":"ContainerStarted","Data":"0df12ed615a2d5ccd7f555583d9b7d5910f91d9f74a47771503d7610b5b17416"}
Oct 07 15:01:18 crc kubenswrapper[4959]: I1007 15:01:18.777702 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l9zm6"
Oct 07 15:01:18 crc kubenswrapper[4959]: I1007 15:01:18.831284 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l9zm6"
Oct 07 15:01:21 crc kubenswrapper[4959]: I1007 15:01:21.239508 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9zm6"]
Oct 07 15:01:21 crc kubenswrapper[4959]: I1007 15:01:21.240500 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l9zm6" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="registry-server" containerID="cri-o://928aee2dfb9fecbd7184a3e1972a4c38181e42a73ead39cce6e7445032601a3a" gracePeriod=2
Oct 07 15:01:22 crc kubenswrapper[4959]: I1007 15:01:22.264093 4959 generic.go:334] "Generic (PLEG): container finished" podID="8fb46878-6028-493f-89c5-923f2297f48d" containerID="928aee2dfb9fecbd7184a3e1972a4c38181e42a73ead39cce6e7445032601a3a" exitCode=0
Oct 07 15:01:22 crc kubenswrapper[4959]: I1007 15:01:22.264209 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerDied","Data":"928aee2dfb9fecbd7184a3e1972a4c38181e42a73ead39cce6e7445032601a3a"}
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.277025 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9zm6" event={"ID":"8fb46878-6028-493f-89c5-923f2297f48d","Type":"ContainerDied","Data":"f8fb37548635839366c0d11d6f0052851e9e5f600ebcd8b6825312319d312ef2"}
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.277486 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8fb37548635839366c0d11d6f0052851e9e5f600ebcd8b6825312319d312ef2"
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.279842 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx5f5" event={"ID":"bb339141-77d5-4681-bc7a-549f37140f3c","Type":"ContainerStarted","Data":"a11a5e9a92c53f13c2146012c54be1f5b59491c1ae56c78b8b9913cef86fc79e"}
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.330468 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9zm6"
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.489244 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-catalog-content\") pod \"8fb46878-6028-493f-89c5-923f2297f48d\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") "
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.489356 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbf5\" (UniqueName: \"kubernetes.io/projected/8fb46878-6028-493f-89c5-923f2297f48d-kube-api-access-cfbf5\") pod \"8fb46878-6028-493f-89c5-923f2297f48d\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") "
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.489384 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-utilities\") pod \"8fb46878-6028-493f-89c5-923f2297f48d\" (UID: \"8fb46878-6028-493f-89c5-923f2297f48d\") "
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.490711 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-utilities" (OuterVolumeSpecName: "utilities") pod "8fb46878-6028-493f-89c5-923f2297f48d" (UID: "8fb46878-6028-493f-89c5-923f2297f48d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.496555 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb46878-6028-493f-89c5-923f2297f48d-kube-api-access-cfbf5" (OuterVolumeSpecName: "kube-api-access-cfbf5") pod "8fb46878-6028-493f-89c5-923f2297f48d" (UID: "8fb46878-6028-493f-89c5-923f2297f48d"). InnerVolumeSpecName "kube-api-access-cfbf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.592513 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbf5\" (UniqueName: \"kubernetes.io/projected/8fb46878-6028-493f-89c5-923f2297f48d-kube-api-access-cfbf5\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.592557 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.594873 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fb46878-6028-493f-89c5-923f2297f48d" (UID: "8fb46878-6028-493f-89c5-923f2297f48d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:01:23 crc kubenswrapper[4959]: I1007 15:01:23.695653 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb46878-6028-493f-89c5-923f2297f48d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:24 crc kubenswrapper[4959]: I1007 15:01:24.291398 4959 generic.go:334] "Generic (PLEG): container finished" podID="bb339141-77d5-4681-bc7a-549f37140f3c" containerID="a11a5e9a92c53f13c2146012c54be1f5b59491c1ae56c78b8b9913cef86fc79e" exitCode=0
Oct 07 15:01:24 crc kubenswrapper[4959]: I1007 15:01:24.291496 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx5f5" event={"ID":"bb339141-77d5-4681-bc7a-549f37140f3c","Type":"ContainerDied","Data":"a11a5e9a92c53f13c2146012c54be1f5b59491c1ae56c78b8b9913cef86fc79e"}
Oct 07 15:01:24 crc kubenswrapper[4959]: I1007 15:01:24.291846 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9zm6"
Oct 07 15:01:24 crc kubenswrapper[4959]: I1007 15:01:24.341356 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9zm6"]
Oct 07 15:01:24 crc kubenswrapper[4959]: I1007 15:01:24.350067 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l9zm6"]
Oct 07 15:01:24 crc kubenswrapper[4959]: I1007 15:01:24.826736 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb46878-6028-493f-89c5-923f2297f48d" path="/var/lib/kubelet/pods/8fb46878-6028-493f-89c5-923f2297f48d/volumes"
Oct 07 15:01:26 crc kubenswrapper[4959]: I1007 15:01:26.310593 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx5f5" event={"ID":"bb339141-77d5-4681-bc7a-549f37140f3c","Type":"ContainerStarted","Data":"a59ba03abc3bdff35124caad1089acffbc0aad58c9929580af31de8023f30630"}
Oct 07 15:01:26 crc kubenswrapper[4959]: I1007 15:01:26.331047 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rx5f5" podStartSLOduration=3.137971078 podStartE2EDuration="10.331026832s" podCreationTimestamp="2025-10-07 15:01:16 +0000 UTC" firstStartedPulling="2025-10-07 15:01:18.208421479 +0000 UTC m=+7230.369144156" lastFinishedPulling="2025-10-07 15:01:25.401477233 +0000 UTC m=+7237.562199910" observedRunningTime="2025-10-07 15:01:26.326681427 +0000 UTC m=+7238.487404114" watchObservedRunningTime="2025-10-07 15:01:26.331026832 +0000 UTC m=+7238.491749509"
Oct 07 15:01:26 crc kubenswrapper[4959]: I1007 15:01:26.784595 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:26 crc kubenswrapper[4959]: I1007 15:01:26.785039 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:27 crc kubenswrapper[4959]: I1007 15:01:27.831078 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rx5f5" podUID="bb339141-77d5-4681-bc7a-549f37140f3c" containerName="registry-server" probeResult="failure" output=<
Oct 07 15:01:27 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s
Oct 07 15:01:27 crc kubenswrapper[4959]: >
Oct 07 15:01:36 crc kubenswrapper[4959]: I1007 15:01:36.840695 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:36 crc kubenswrapper[4959]: I1007 15:01:36.902412 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rx5f5"
Oct 07 15:01:36 crc kubenswrapper[4959]: I1007 15:01:36.979469 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rx5f5"]
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.084672 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ml74p"]
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.084940 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ml74p" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="registry-server" containerID="cri-o://5921b06abe7189b11590b982572efc26959577349eda6853bc99a729737653cc" gracePeriod=2
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.453952 4959 generic.go:334] "Generic (PLEG): container finished" podID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerID="5921b06abe7189b11590b982572efc26959577349eda6853bc99a729737653cc" exitCode=0
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.454580 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerDied","Data":"5921b06abe7189b11590b982572efc26959577349eda6853bc99a729737653cc"}
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.638420 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml74p"
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.677900 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cvxg\" (UniqueName: \"kubernetes.io/projected/60d3850b-97a0-44a3-959a-7bcfe6524a49-kube-api-access-2cvxg\") pod \"60d3850b-97a0-44a3-959a-7bcfe6524a49\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") "
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.678032 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-catalog-content\") pod \"60d3850b-97a0-44a3-959a-7bcfe6524a49\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") "
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.678053 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-utilities\") pod \"60d3850b-97a0-44a3-959a-7bcfe6524a49\" (UID: \"60d3850b-97a0-44a3-959a-7bcfe6524a49\") "
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.678928 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-utilities" (OuterVolumeSpecName: "utilities") pod "60d3850b-97a0-44a3-959a-7bcfe6524a49" (UID: "60d3850b-97a0-44a3-959a-7bcfe6524a49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.693805 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d3850b-97a0-44a3-959a-7bcfe6524a49-kube-api-access-2cvxg" (OuterVolumeSpecName: "kube-api-access-2cvxg") pod "60d3850b-97a0-44a3-959a-7bcfe6524a49" (UID: "60d3850b-97a0-44a3-959a-7bcfe6524a49"). InnerVolumeSpecName "kube-api-access-2cvxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.749481 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60d3850b-97a0-44a3-959a-7bcfe6524a49" (UID: "60d3850b-97a0-44a3-959a-7bcfe6524a49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.780730 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cvxg\" (UniqueName: \"kubernetes.io/projected/60d3850b-97a0-44a3-959a-7bcfe6524a49-kube-api-access-2cvxg\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.780764 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:37 crc kubenswrapper[4959]: I1007 15:01:37.780774 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d3850b-97a0-44a3-959a-7bcfe6524a49-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.467134 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml74p" event={"ID":"60d3850b-97a0-44a3-959a-7bcfe6524a49","Type":"ContainerDied","Data":"43109ff57e0030e5059e71db4a5c392bdf2fd7b360b304086769be94e0f83472"}
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.467146 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml74p"
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.467670 4959 scope.go:117] "RemoveContainer" containerID="5921b06abe7189b11590b982572efc26959577349eda6853bc99a729737653cc"
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.509520 4959 scope.go:117] "RemoveContainer" containerID="1798afb9f148632e03763b4dc16f47a4957be131ece10e18b6745665409d54df"
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.511474 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ml74p"]
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.517354 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ml74p"]
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.550190 4959 scope.go:117] "RemoveContainer" containerID="6946fc76285b7bf8c649a2d656f94b4c754f3059074420363f25545a8372cb7f"
Oct 07 15:01:38 crc kubenswrapper[4959]: I1007 15:01:38.820558 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" path="/var/lib/kubelet/pods/60d3850b-97a0-44a3-959a-7bcfe6524a49/volumes"
Oct 07 15:02:07 crc kubenswrapper[4959]: I1007 15:02:07.695997 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:02:07 crc kubenswrapper[4959]: I1007 15:02:07.696758 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:02:37 crc kubenswrapper[4959]: I1007 15:02:37.695538 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:02:37 crc kubenswrapper[4959]: I1007 15:02:37.696438 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:03:07 crc kubenswrapper[4959]: I1007 15:03:07.696513 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:03:07 crc kubenswrapper[4959]: I1007 15:03:07.697400 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:03:07 crc kubenswrapper[4959]: I1007 15:03:07.697488 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 15:03:07 crc kubenswrapper[4959]: I1007 15:03:07.698861 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 15:03:07 crc kubenswrapper[4959]: I1007 15:03:07.698933 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" gracePeriod=600
Oct 07 15:03:07 crc kubenswrapper[4959]: E1007 15:03:07.851538 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:03:08 crc kubenswrapper[4959]: I1007 15:03:08.409360 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" exitCode=0
Oct 07 15:03:08 crc kubenswrapper[4959]: I1007 15:03:08.409430 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"}
Oct 07 15:03:08 crc kubenswrapper[4959]: I1007 15:03:08.409477 4959 scope.go:117] "RemoveContainer" containerID="11c7d93d66e90b7dccf9beeccc3894ee7bb0122d1c52abc03b21157b67415b06"
Oct 07 15:03:08 crc kubenswrapper[4959]: I1007 15:03:08.410638 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:03:08 crc kubenswrapper[4959]: E1007 15:03:08.410976 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:03:20 crc kubenswrapper[4959]: I1007 15:03:20.809189 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:03:20 crc kubenswrapper[4959]: E1007 15:03:20.810287 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:03:35 crc kubenswrapper[4959]: I1007 15:03:35.809574 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:03:35 crc kubenswrapper[4959]: E1007 15:03:35.810352 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:03:39 crc kubenswrapper[4959]: I1007 15:03:39.723428 4959 generic.go:334] "Generic (PLEG): container finished" podID="b14b7636-6093-478a-945a-a512ef1935b4" containerID="23431d0f8d90db1970972714c99df72717598486185f6c6b3e977e8e2f948b9c" exitCode=0
Oct 07 15:03:39 crc kubenswrapper[4959]: I1007 15:03:39.723546 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"b14b7636-6093-478a-945a-a512ef1935b4","Type":"ContainerDied","Data":"23431d0f8d90db1970972714c99df72717598486185f6c6b3e977e8e2f948b9c"}
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.537549 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.601774 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config-secret\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") "
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.601903 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ssh-key\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") "
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.601944 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-workdir\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") "
Oct 07 15:03:41 crc
kubenswrapper[4959]: I1007 15:03:41.605125 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.605237 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ca-certs\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.605303 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.606197 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ceph\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.606262 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-temporary\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.606408 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-config-data\") pod 
\"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.606483 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnqkc\" (UniqueName: \"kubernetes.io/projected/b14b7636-6093-478a-945a-a512ef1935b4-kube-api-access-mnqkc\") pod \"b14b7636-6093-478a-945a-a512ef1935b4\" (UID: \"b14b7636-6093-478a-945a-a512ef1935b4\") " Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.607309 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.607964 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.608019 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-config-data" (OuterVolumeSpecName: "config-data") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.612483 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"]
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.612945 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="extract-content"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.612964 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="extract-content"
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.612982 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14b7636-6093-478a-945a-a512ef1935b4" containerName="tempest-tests-tempest-tests-runner"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.612988 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14b7636-6093-478a-945a-a512ef1935b4" containerName="tempest-tests-tempest-tests-runner"
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.613001 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="registry-server"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613015 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="registry-server"
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.613028 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="extract-utilities"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613034 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="extract-utilities"
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.613054 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="extract-utilities"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613063 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="extract-utilities"
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.613071 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="extract-content"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613077 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="extract-content"
Oct 07 15:03:41 crc kubenswrapper[4959]: E1007 15:03:41.613098 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="registry-server"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613104 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="registry-server"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613283 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d3850b-97a0-44a3-959a-7bcfe6524a49" containerName="registry-server"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613295 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14b7636-6093-478a-945a-a512ef1935b4" containerName="tempest-tests-tempest-tests-runner"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613310 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb46878-6028-493f-89c5-923f2297f48d" containerName="registry-server"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.613914 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.614085 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.614499 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.615970 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14b7636-6093-478a-945a-a512ef1935b4-kube-api-access-mnqkc" (OuterVolumeSpecName: "kube-api-access-mnqkc") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "kube-api-access-mnqkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.626361 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ceph" (OuterVolumeSpecName: "ceph") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.627287 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.629080 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.635206 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"]
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.659241 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.662723 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.675492 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.688984 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b14b7636-6093-478a-945a-a512ef1935b4" (UID: "b14b7636-6093-478a-945a-a512ef1935b4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710176 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710242 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710302 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldsb\" (UniqueName: \"kubernetes.io/projected/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-kube-api-access-gldsb\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710326 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710433 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710456 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710543 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710582 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710606 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710645 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710711 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnqkc\" (UniqueName: \"kubernetes.io/projected/b14b7636-6093-478a-945a-a512ef1935b4-kube-api-access-mnqkc\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710727 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710738 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710749 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b14b7636-6093-478a-945a-a512ef1935b4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710760 4959 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ca-certs\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710770 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710782 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b14b7636-6093-478a-945a-a512ef1935b4-ceph\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.710791 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b14b7636-6093-478a-945a-a512ef1935b4-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.745579 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.747500 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"b14b7636-6093-478a-945a-a512ef1935b4","Type":"ContainerDied","Data":"7910e285370d0ef417a997332303da89f6eeeb5e48cc907ccdab1f4a656849f7"}
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.747551 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7910e285370d0ef417a997332303da89f6eeeb5e48cc907ccdab1f4a656849f7"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.747562 4959 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813038 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813099 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813132 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813156 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813185 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813212 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gldsb\" (UniqueName: \"kubernetes.io/projected/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-kube-api-access-gldsb\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813229 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813300 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.813326 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.814060 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.814401 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.814847 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.815080 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.819041 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.819113 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.819716 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.821953 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:41 crc kubenswrapper[4959]: I1007 15:03:41.837603 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldsb\" (UniqueName: \"kubernetes.io/projected/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-kube-api-access-gldsb\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:42 crc kubenswrapper[4959]: I1007 15:03:42.077897 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:03:42 crc kubenswrapper[4959]: I1007 15:03:42.708074 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"]
Oct 07 15:03:42 crc kubenswrapper[4959]: I1007 15:03:42.758765 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8","Type":"ContainerStarted","Data":"aab27e5e87c256c3173f1a4c3be68f45fe94d41a197d5a2306d9f155e19249f7"}
Oct 07 15:03:44 crc kubenswrapper[4959]: I1007 15:03:44.784031 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8","Type":"ContainerStarted","Data":"c56453db382d12c08df4192a0295a91db0f559ba6d2fa4933b24fa5db5756bac"}
Oct 07 15:03:44 crc kubenswrapper[4959]: I1007 15:03:44.835842 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-test" podStartSLOduration=3.835814594 podStartE2EDuration="3.835814594s" podCreationTimestamp="2025-10-07 15:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:03:44.820655449 +0000 UTC m=+7376.981378166" watchObservedRunningTime="2025-10-07 15:03:44.835814594 +0000 UTC m=+7376.996537281"
Oct 07 15:03:46 crc kubenswrapper[4959]: I1007 15:03:46.817082 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:03:46 crc kubenswrapper[4959]: E1007 15:03:46.818294 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:04:01 crc kubenswrapper[4959]: I1007 15:04:01.809231 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:04:01 crc kubenswrapper[4959]: E1007 15:04:01.810263 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:04:14 crc kubenswrapper[4959]: I1007 15:04:14.809523 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:04:14 crc kubenswrapper[4959]: E1007 15:04:14.810957 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:04:28 crc kubenswrapper[4959]: I1007 15:04:28.815446 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:04:28 crc kubenswrapper[4959]: E1007 15:04:28.816476 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:04:39 crc kubenswrapper[4959]: I1007 15:04:39.809329 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:04:39 crc kubenswrapper[4959]: E1007 15:04:39.810386 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:04:53 crc kubenswrapper[4959]: I1007 15:04:53.809266 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:04:53 crc kubenswrapper[4959]: E1007 15:04:53.810222 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:05:07 crc kubenswrapper[4959]: I1007 15:05:07.809408 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:05:07 crc kubenswrapper[4959]: E1007 15:05:07.812174 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s
restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:05:18 crc kubenswrapper[4959]: I1007 15:05:18.816265 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:05:18 crc kubenswrapper[4959]: E1007 15:05:18.818356 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:05:31 crc kubenswrapper[4959]: I1007 15:05:31.809302 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:05:31 crc kubenswrapper[4959]: E1007 15:05:31.810398 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:05:43 crc kubenswrapper[4959]: I1007 15:05:43.810206 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:05:43 crc kubenswrapper[4959]: E1007 15:05:43.811887 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:05:57 crc kubenswrapper[4959]: I1007 15:05:57.808489 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:05:57 crc kubenswrapper[4959]: E1007 15:05:57.809214 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:06:08 crc kubenswrapper[4959]: I1007 15:06:08.831454 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:06:08 crc kubenswrapper[4959]: E1007 15:06:08.832404 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:06:20 crc kubenswrapper[4959]: I1007 15:06:20.810322 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:06:20 crc kubenswrapper[4959]: E1007 15:06:20.811728 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:06:34 crc kubenswrapper[4959]: I1007 15:06:34.810364 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:06:34 crc kubenswrapper[4959]: E1007 15:06:34.811769 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:06:48 crc kubenswrapper[4959]: I1007 15:06:48.823734 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:06:48 crc kubenswrapper[4959]: E1007 15:06:48.825029 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:07:01 crc kubenswrapper[4959]: I1007 15:07:01.809428 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:07:01 crc kubenswrapper[4959]: E1007 15:07:01.810919 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.209213 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ln6gw"] Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.212755 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.220672 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ln6gw"] Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.409655 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-catalog-content\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.410120 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtxt\" (UniqueName: \"kubernetes.io/projected/d962b7f7-a985-4b6c-a854-88e1d1646ecb-kube-api-access-5dtxt\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.410305 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-utilities\") pod 
\"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.512108 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-catalog-content\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.512175 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtxt\" (UniqueName: \"kubernetes.io/projected/d962b7f7-a985-4b6c-a854-88e1d1646ecb-kube-api-access-5dtxt\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.512299 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-utilities\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.512790 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-utilities\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.512798 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-catalog-content\") pod \"community-operators-ln6gw\" (UID: 
\"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.540746 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtxt\" (UniqueName: \"kubernetes.io/projected/d962b7f7-a985-4b6c-a854-88e1d1646ecb-kube-api-access-5dtxt\") pod \"community-operators-ln6gw\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:09 crc kubenswrapper[4959]: I1007 15:07:09.833600 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:10 crc kubenswrapper[4959]: I1007 15:07:10.273416 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ln6gw"] Oct 07 15:07:10 crc kubenswrapper[4959]: I1007 15:07:10.920479 4959 generic.go:334] "Generic (PLEG): container finished" podID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerID="8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26" exitCode=0 Oct 07 15:07:10 crc kubenswrapper[4959]: I1007 15:07:10.920722 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln6gw" event={"ID":"d962b7f7-a985-4b6c-a854-88e1d1646ecb","Type":"ContainerDied","Data":"8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26"} Oct 07 15:07:10 crc kubenswrapper[4959]: I1007 15:07:10.920961 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln6gw" event={"ID":"d962b7f7-a985-4b6c-a854-88e1d1646ecb","Type":"ContainerStarted","Data":"2f2b05ee8fb6a8395d59746bdbf1990c70f2d44dba4e585066fac70173a3cf88"} Oct 07 15:07:10 crc kubenswrapper[4959]: I1007 15:07:10.924406 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:07:12 crc kubenswrapper[4959]: I1007 15:07:12.941431 4959 
generic.go:334] "Generic (PLEG): container finished" podID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerID="6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319" exitCode=0 Oct 07 15:07:12 crc kubenswrapper[4959]: I1007 15:07:12.941525 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln6gw" event={"ID":"d962b7f7-a985-4b6c-a854-88e1d1646ecb","Type":"ContainerDied","Data":"6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319"} Oct 07 15:07:14 crc kubenswrapper[4959]: I1007 15:07:14.809545 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:07:14 crc kubenswrapper[4959]: E1007 15:07:14.811175 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:07:14 crc kubenswrapper[4959]: I1007 15:07:14.973900 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln6gw" event={"ID":"d962b7f7-a985-4b6c-a854-88e1d1646ecb","Type":"ContainerStarted","Data":"3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203"} Oct 07 15:07:14 crc kubenswrapper[4959]: I1007 15:07:14.996749 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ln6gw" podStartSLOduration=3.064047222 podStartE2EDuration="5.996727508s" podCreationTimestamp="2025-10-07 15:07:09 +0000 UTC" firstStartedPulling="2025-10-07 15:07:10.924006857 +0000 UTC m=+7583.084729534" lastFinishedPulling="2025-10-07 15:07:13.856687143 +0000 UTC m=+7586.017409820" 
observedRunningTime="2025-10-07 15:07:14.995126262 +0000 UTC m=+7587.155848949" watchObservedRunningTime="2025-10-07 15:07:14.996727508 +0000 UTC m=+7587.157450185" Oct 07 15:07:19 crc kubenswrapper[4959]: I1007 15:07:19.833810 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:19 crc kubenswrapper[4959]: I1007 15:07:19.834513 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:19 crc kubenswrapper[4959]: I1007 15:07:19.883950 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:20 crc kubenswrapper[4959]: I1007 15:07:20.070802 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:20 crc kubenswrapper[4959]: I1007 15:07:20.128715 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ln6gw"] Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.041704 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ln6gw" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="registry-server" containerID="cri-o://3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203" gracePeriod=2 Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.561586 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.729930 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-utilities\") pod \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.730144 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-catalog-content\") pod \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.730200 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dtxt\" (UniqueName: \"kubernetes.io/projected/d962b7f7-a985-4b6c-a854-88e1d1646ecb-kube-api-access-5dtxt\") pod \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\" (UID: \"d962b7f7-a985-4b6c-a854-88e1d1646ecb\") " Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.731323 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-utilities" (OuterVolumeSpecName: "utilities") pod "d962b7f7-a985-4b6c-a854-88e1d1646ecb" (UID: "d962b7f7-a985-4b6c-a854-88e1d1646ecb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.732991 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.741704 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d962b7f7-a985-4b6c-a854-88e1d1646ecb-kube-api-access-5dtxt" (OuterVolumeSpecName: "kube-api-access-5dtxt") pod "d962b7f7-a985-4b6c-a854-88e1d1646ecb" (UID: "d962b7f7-a985-4b6c-a854-88e1d1646ecb"). InnerVolumeSpecName "kube-api-access-5dtxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.796941 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d962b7f7-a985-4b6c-a854-88e1d1646ecb" (UID: "d962b7f7-a985-4b6c-a854-88e1d1646ecb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.835151 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d962b7f7-a985-4b6c-a854-88e1d1646ecb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:22 crc kubenswrapper[4959]: I1007 15:07:22.835531 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dtxt\" (UniqueName: \"kubernetes.io/projected/d962b7f7-a985-4b6c-a854-88e1d1646ecb-kube-api-access-5dtxt\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.056865 4959 generic.go:334] "Generic (PLEG): container finished" podID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerID="3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203" exitCode=0 Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.056916 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln6gw" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.056917 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln6gw" event={"ID":"d962b7f7-a985-4b6c-a854-88e1d1646ecb","Type":"ContainerDied","Data":"3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203"} Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.057019 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln6gw" event={"ID":"d962b7f7-a985-4b6c-a854-88e1d1646ecb","Type":"ContainerDied","Data":"2f2b05ee8fb6a8395d59746bdbf1990c70f2d44dba4e585066fac70173a3cf88"} Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.057040 4959 scope.go:117] "RemoveContainer" containerID="3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.088154 4959 scope.go:117] "RemoveContainer" 
containerID="6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.097964 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ln6gw"] Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.109402 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ln6gw"] Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.133807 4959 scope.go:117] "RemoveContainer" containerID="8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.162254 4959 scope.go:117] "RemoveContainer" containerID="3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203" Oct 07 15:07:23 crc kubenswrapper[4959]: E1007 15:07:23.162818 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203\": container with ID starting with 3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203 not found: ID does not exist" containerID="3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.162920 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203"} err="failed to get container status \"3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203\": rpc error: code = NotFound desc = could not find container \"3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203\": container with ID starting with 3f4d61d4be15a9f8d59c0c731f4edd6b8fd3880992728254b8fb76a36aecf203 not found: ID does not exist" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.163001 4959 scope.go:117] "RemoveContainer" 
containerID="6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319" Oct 07 15:07:23 crc kubenswrapper[4959]: E1007 15:07:23.167035 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319\": container with ID starting with 6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319 not found: ID does not exist" containerID="6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.167607 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319"} err="failed to get container status \"6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319\": rpc error: code = NotFound desc = could not find container \"6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319\": container with ID starting with 6add345d442cac9043d3d841d54de8a03658b6217feaddfa89a9b2487c7ef319 not found: ID does not exist" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.167682 4959 scope.go:117] "RemoveContainer" containerID="8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26" Oct 07 15:07:23 crc kubenswrapper[4959]: E1007 15:07:23.168128 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26\": container with ID starting with 8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26 not found: ID does not exist" containerID="8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26" Oct 07 15:07:23 crc kubenswrapper[4959]: I1007 15:07:23.168285 4959 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26"} err="failed to get container status \"8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26\": rpc error: code = NotFound desc = could not find container \"8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26\": container with ID starting with 8d620ff119c84c8a92079a8c08df9c0f44c5f7d73b487968ac3968cd645e1c26 not found: ID does not exist" Oct 07 15:07:24 crc kubenswrapper[4959]: I1007 15:07:24.820599 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" path="/var/lib/kubelet/pods/d962b7f7-a985-4b6c-a854-88e1d1646ecb/volumes" Oct 07 15:07:27 crc kubenswrapper[4959]: I1007 15:07:27.810697 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:07:27 crc kubenswrapper[4959]: E1007 15:07:27.811960 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:07:38 crc kubenswrapper[4959]: I1007 15:07:38.821780 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:07:38 crc kubenswrapper[4959]: E1007 15:07:38.822496 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:07:49 crc kubenswrapper[4959]: I1007 15:07:49.810216 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:07:49 crc kubenswrapper[4959]: E1007 15:07:49.812036 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:07:56 crc kubenswrapper[4959]: I1007 15:07:56.854787 4959 scope.go:117] "RemoveContainer" containerID="928aee2dfb9fecbd7184a3e1972a4c38181e42a73ead39cce6e7445032601a3a" Oct 07 15:07:56 crc kubenswrapper[4959]: I1007 15:07:56.878620 4959 scope.go:117] "RemoveContainer" containerID="e1ba70494f9b69ecd188667cb47b738ee6d98c45ee674821a97240f26eeb689b" Oct 07 15:07:56 crc kubenswrapper[4959]: I1007 15:07:56.900716 4959 scope.go:117] "RemoveContainer" containerID="9154d19d5a513ac8ebd9d2c60a6e61a1e89621d9041252cea361a185f5289e5a" Oct 07 15:08:03 crc kubenswrapper[4959]: I1007 15:08:03.809346 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:08:03 crc kubenswrapper[4959]: E1007 15:08:03.811712 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 
07 15:08:14 crc kubenswrapper[4959]: I1007 15:08:14.566397 4959 generic.go:334] "Generic (PLEG): container finished" podID="9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" containerID="c56453db382d12c08df4192a0295a91db0f559ba6d2fa4933b24fa5db5756bac" exitCode=0
Oct 07 15:08:14 crc kubenswrapper[4959]: I1007 15:08:14.566491 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8","Type":"ContainerDied","Data":"c56453db382d12c08df4192a0295a91db0f559ba6d2fa4933b24fa5db5756bac"}
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.076040 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.194938 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-workdir\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195034 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ssh-key\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195073 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gldsb\" (UniqueName: \"kubernetes.io/projected/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-kube-api-access-gldsb\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195109 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-temporary\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195156 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ceph\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195214 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config-secret\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195256 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-config-data\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195309 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195411 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ca-certs\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.195433 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config\") pod \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\" (UID: \"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8\") "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.199923 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-config-data" (OuterVolumeSpecName: "config-data") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.200437 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.203544 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.203977 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ceph" (OuterVolumeSpecName: "ceph") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.204046 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-kube-api-access-gldsb" (OuterVolumeSpecName: "kube-api-access-gldsb") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "kube-api-access-gldsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.204665 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.242914 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.243676 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.245618 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.256953 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" (UID: "9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298298 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ceph\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298343 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298356 4959 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298414 4959 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298431 4959 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ca-certs\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298441 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298452 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298464 4959 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298478 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gldsb\" (UniqueName: \"kubernetes.io/projected/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-kube-api-access-gldsb\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.298491 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.322599 4959 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.400949 4959 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.591985 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8","Type":"ContainerDied","Data":"aab27e5e87c256c3173f1a4c3be68f45fe94d41a197d5a2306d9f155e19249f7"}
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.592046 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aab27e5e87c256c3173f1a4c3be68f45fe94d41a197d5a2306d9f155e19249f7"
Oct 07 15:08:16 crc kubenswrapper[4959]: I1007 15:08:16.592173 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Oct 07 15:08:18 crc kubenswrapper[4959]: I1007 15:08:18.832089 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.139237 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Oct 07 15:08:19 crc kubenswrapper[4959]: E1007 15:08:19.140307 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="extract-utilities"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.140331 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="extract-utilities"
Oct 07 15:08:19 crc kubenswrapper[4959]: E1007 15:08:19.140356 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="extract-content"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.140363 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="extract-content"
Oct 07 15:08:19 crc kubenswrapper[4959]: E1007 15:08:19.140381 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" containerName="tempest-tests-tempest-tests-runner"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.140389 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" containerName="tempest-tests-tempest-tests-runner"
Oct 07 15:08:19 crc kubenswrapper[4959]: E1007 15:08:19.140427 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="registry-server"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.140435 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="registry-server"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.140656 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8" containerName="tempest-tests-tempest-tests-runner"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.140677 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="d962b7f7-a985-4b6c-a854-88e1d1646ecb" containerName="registry-server"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.142062 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.146109 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bm2d9"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.153244 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.267512 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2lf\" (UniqueName: \"kubernetes.io/projected/2611594a-b816-4cdc-b55b-d6ac6e281071-kube-api-access-rz2lf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.267646 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.369986 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2lf\" (UniqueName: \"kubernetes.io/projected/2611594a-b816-4cdc-b55b-d6ac6e281071-kube-api-access-rz2lf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.370069 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.370754 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.400501 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2lf\" (UniqueName: \"kubernetes.io/projected/2611594a-b816-4cdc-b55b-d6ac6e281071-kube-api-access-rz2lf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.402173 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2611594a-b816-4cdc-b55b-d6ac6e281071\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.474750 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.628267 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"a3ed63282df901adf55f77883df90781c0e6935c29dea4d25bec61214523c3c6"}
Oct 07 15:08:19 crc kubenswrapper[4959]: I1007 15:08:19.986285 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Oct 07 15:08:20 crc kubenswrapper[4959]: I1007 15:08:20.646194 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2611594a-b816-4cdc-b55b-d6ac6e281071","Type":"ContainerStarted","Data":"58b0ae64f36d118f8faddbe1ae63c63035680bf6c909d560ca7a7ba7fb8f82ac"}
Oct 07 15:08:22 crc kubenswrapper[4959]: I1007 15:08:22.667683 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2611594a-b816-4cdc-b55b-d6ac6e281071","Type":"ContainerStarted","Data":"88ac104814c57e4e6d3e83a26fa1e33777527543cfe59e1d53398396967d0f34"}
Oct 07 15:08:22 crc kubenswrapper[4959]: I1007 15:08:22.686709 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.928937492 podStartE2EDuration="3.686689462s" podCreationTimestamp="2025-10-07 15:08:19 +0000 UTC" firstStartedPulling="2025-10-07 15:08:19.999512291 +0000 UTC m=+7652.160234968" lastFinishedPulling="2025-10-07 15:08:21.757264261 +0000 UTC m=+7653.917986938" observedRunningTime="2025-10-07 15:08:22.683309345 +0000 UTC m=+7654.844032032" watchObservedRunningTime="2025-10-07 15:08:22.686689462 +0000 UTC m=+7654.847412149"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.859050 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"]
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.861945 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.867673 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-config"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.868130 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.868232 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-private-key"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.868314 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"tobiko-secret"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.868324 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-public-key"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.880072 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"]
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988442 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988553 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988589 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988694 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988782 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg9ln\" (UniqueName: \"kubernetes.io/projected/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kube-api-access-fg9ln\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988809 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988868 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988886 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988902 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988964 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.988986 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:43 crc kubenswrapper[4959]: I1007 15:08:43.989004 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.090792 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg9ln\" (UniqueName: \"kubernetes.io/projected/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kube-api-access-fg9ln\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.090863 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.090923 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.090946 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.090967 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091014 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091039 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091061 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091093 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091119 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091143 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091196 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091539 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.091761 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.092024 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.092363 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.092659 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.092884 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.094234 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.098120 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.106468 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.107116 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.107807 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.110517 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg9ln\" (UniqueName: \"kubernetes.io/projected/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kube-api-access-fg9ln\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.131358 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional"
Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.197177 4959 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 07 15:08:44 crc kubenswrapper[4959]: W1007 15:08:44.766430 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d2c381_4cb1_4b35_b315_d1d4847f70c7.slice/crio-18af175d095689ef6687cead2ea5f76c8207fe76d73aeaa8e3d631a3b1444b12 WatchSource:0}: Error finding container 18af175d095689ef6687cead2ea5f76c8207fe76d73aeaa8e3d631a3b1444b12: Status 404 returned error can't find the container with id 18af175d095689ef6687cead2ea5f76c8207fe76d73aeaa8e3d631a3b1444b12 Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.771741 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Oct 07 15:08:44 crc kubenswrapper[4959]: I1007 15:08:44.910830 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"36d2c381-4cb1-4b35-b315-d1d4847f70c7","Type":"ContainerStarted","Data":"18af175d095689ef6687cead2ea5f76c8207fe76d73aeaa8e3d631a3b1444b12"} Oct 07 15:09:04 crc kubenswrapper[4959]: E1007 15:09:04.528049 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tobiko:current-podified" Oct 07 15:09:04 crc kubenswrapper[4959]: E1007 15:09:04.529237 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tobiko-tests-tobiko,Image:quay.io/podified-antelope-centos9/openstack-tobiko:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TOBIKO_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:TOBIKO_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:TOBIKO_LOGS_DIR_NAME,Value:tobiko-tests-tobiko-s00-podified-functional,ValueFrom:nil,},EnvVar{Name:TOBIKO_PYTEST_ADDOPTS,Value:,ValueFrom:nil,},EnvVar{Name:TOBIKO_TESTENV,Value:functional -- tobiko/tests/functional/podified/test_topology.py,ValueFrom:nil,},EnvVar{Name:TOBIKO_VERSION,Value:master,ValueFrom:nil,},EnvVar{Name:TOX_NUM_PROCESSES,Value:2,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{8 0} {} 8 DecimalSI},memory: {{8589934592 0} {} BinarySI},},Requests:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tobiko,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tobiko/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/tobiko/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveR
eadOnly:nil,},VolumeMount{Name:tobiko-config,ReadOnly:false,MountPath:/etc/tobiko/tobiko.conf,SubPath:tobiko.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tobiko-private-key,ReadOnly:true,MountPath:/etc/test_operator/id_ecdsa,SubPath:id_ecdsa,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tobiko-public-key,ReadOnly:true,MountPath:/etc/test_operator/id_ecdsa.pub,SubPath:id_ecdsa.pub,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kubeconfig,ReadOnly:true,MountPath:/var/lib/tobiko/.kube/config,SubPath:config,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fg9ln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN 
NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42495,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42495,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tobiko-tests-tobiko-s00-podified-functional_openstack(36d2c381-4cb1-4b35-b315-d1d4847f70c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 15:09:04 crc kubenswrapper[4959]: E1007 15:09:04.530498 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tobiko-tests-tobiko\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podUID="36d2c381-4cb1-4b35-b315-d1d4847f70c7" Oct 07 15:09:05 crc kubenswrapper[4959]: E1007 15:09:05.182776 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tobiko-tests-tobiko\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tobiko:current-podified\\\"\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podUID="36d2c381-4cb1-4b35-b315-d1d4847f70c7" Oct 07 15:09:19 crc kubenswrapper[4959]: I1007 15:09:19.335832 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"36d2c381-4cb1-4b35-b315-d1d4847f70c7","Type":"ContainerStarted","Data":"ba9557beba5aba82e95e637fb07c59dfb7cfbda2b681ab5bb17497b0eae7db74"} Oct 07 15:09:19 crc kubenswrapper[4959]: I1007 15:09:19.372241 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podStartSLOduration=3.646031882 podStartE2EDuration="37.372211835s" podCreationTimestamp="2025-10-07 15:08:42 +0000 UTC" firstStartedPulling="2025-10-07 15:08:44.772294079 +0000 UTC m=+7676.933016756" lastFinishedPulling="2025-10-07 15:09:18.498474032 +0000 UTC m=+7710.659196709" observedRunningTime="2025-10-07 15:09:19.358934744 +0000 UTC m=+7711.519657441" watchObservedRunningTime="2025-10-07 15:09:19.372211835 +0000 UTC m=+7711.532934512" Oct 07 15:10:37 crc kubenswrapper[4959]: I1007 15:10:37.183557 4959 generic.go:334] "Generic (PLEG): container finished" podID="36d2c381-4cb1-4b35-b315-d1d4847f70c7" containerID="ba9557beba5aba82e95e637fb07c59dfb7cfbda2b681ab5bb17497b0eae7db74" exitCode=0 Oct 07 15:10:37 crc kubenswrapper[4959]: I1007 15:10:37.183676 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"36d2c381-4cb1-4b35-b315-d1d4847f70c7","Type":"ContainerDied","Data":"ba9557beba5aba82e95e637fb07c59dfb7cfbda2b681ab5bb17497b0eae7db74"} Oct 07 15:10:37 crc kubenswrapper[4959]: I1007 15:10:37.696232 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:10:37 crc kubenswrapper[4959]: I1007 15:10:37.696309 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.634885 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.705729 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Oct 07 15:10:38 crc kubenswrapper[4959]: E1007 15:10:38.706350 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d2c381-4cb1-4b35-b315-d1d4847f70c7" containerName="tobiko-tests-tobiko" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.706366 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d2c381-4cb1-4b35-b315-d1d4847f70c7" containerName="tobiko-tests-tobiko" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.706829 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d2c381-4cb1-4b35-b315-d1d4847f70c7" containerName="tobiko-tests-tobiko" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.707908 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.726167 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.728018 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-workdir\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.728293 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.728426 4959 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg9ln\" (UniqueName: \"kubernetes.io/projected/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kube-api-access-fg9ln\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.728797 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-private-key\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.728911 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ca-certs\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729000 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-clouds-config\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729130 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-public-key\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729271 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-config\") 
pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729354 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-openstack-config-secret\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729484 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kubeconfig\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729576 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ceph\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.729694 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-temporary\") pod \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\" (UID: \"36d2c381-4cb1-4b35-b315-d1d4847f70c7\") " Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.730254 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.730459 4959 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.730577 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.730810 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.730945 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.731047 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " 
pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.731165 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sj4g\" (UniqueName: \"kubernetes.io/projected/6d961e86-0037-4c2a-ac1f-b73c10339406-kube-api-access-9sj4g\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.731308 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.731455 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.731723 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.732087 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.736464 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kube-api-access-fg9ln" (OuterVolumeSpecName: "kube-api-access-fg9ln") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "kube-api-access-fg9ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.736964 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.741253 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.760524 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "tobiko-private-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.775426 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ceph" (OuterVolumeSpecName: "ceph") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.778547 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.778906 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.781928 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.813481 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.817338 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839371 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839441 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839462 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-private-key\") pod 
\"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839505 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sj4g\" (UniqueName: \"kubernetes.io/projected/6d961e86-0037-4c2a-ac1f-b73c10339406-kube-api-access-9sj4g\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839544 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839656 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839907 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.839952 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840164 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840226 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840292 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840316 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840424 4959 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kubeconfig\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc 
kubenswrapper[4959]: I1007 15:10:38.840442 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840452 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840466 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg9ln\" (UniqueName: \"kubernetes.io/projected/36d2c381-4cb1-4b35-b315-d1d4847f70c7-kube-api-access-fg9ln\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840476 4959 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840485 4959 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840493 4959 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840503 4959 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-tobiko-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840511 4959 reconciler_common.go:293] "Volume detached for 
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36d2c381-4cb1-4b35-b315-d1d4847f70c7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840856 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.840918 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.841059 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.841308 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.843360 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: 
\"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.843366 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.843598 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.844933 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.846416 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.847107 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: "36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.849337 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.864456 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sj4g\" (UniqueName: \"kubernetes.io/projected/6d961e86-0037-4c2a-ac1f-b73c10339406-kube-api-access-9sj4g\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.879261 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:38 crc kubenswrapper[4959]: I1007 15:10:38.943962 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:39 crc kubenswrapper[4959]: I1007 15:10:39.051975 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:10:39 crc kubenswrapper[4959]: I1007 15:10:39.209702 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"36d2c381-4cb1-4b35-b315-d1d4847f70c7","Type":"ContainerDied","Data":"18af175d095689ef6687cead2ea5f76c8207fe76d73aeaa8e3d631a3b1444b12"} Oct 07 15:10:39 crc kubenswrapper[4959]: I1007 15:10:39.210064 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18af175d095689ef6687cead2ea5f76c8207fe76d73aeaa8e3d631a3b1444b12" Oct 07 15:10:39 crc kubenswrapper[4959]: I1007 15:10:39.209866 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 07 15:10:39 crc kubenswrapper[4959]: I1007 15:10:39.589347 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Oct 07 15:10:40 crc kubenswrapper[4959]: I1007 15:10:40.229238 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"6d961e86-0037-4c2a-ac1f-b73c10339406","Type":"ContainerStarted","Data":"09d0e0a287aea088809452e4fb8642b797e4ebc6c3b86839b294571d24ce1c39"} Oct 07 15:10:40 crc kubenswrapper[4959]: I1007 15:10:40.231122 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"6d961e86-0037-4c2a-ac1f-b73c10339406","Type":"ContainerStarted","Data":"ff1b5a733fa099719b49332a0f252ac51c417d5a02069dcf0f74df6fcd297f44"} Oct 07 15:10:40 crc kubenswrapper[4959]: I1007 15:10:40.270069 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "36d2c381-4cb1-4b35-b315-d1d4847f70c7" (UID: 
"36d2c381-4cb1-4b35-b315-d1d4847f70c7"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:40 crc kubenswrapper[4959]: I1007 15:10:40.273135 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36d2c381-4cb1-4b35-b315-d1d4847f70c7-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:41 crc kubenswrapper[4959]: I1007 15:10:41.275226 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s01-sanity" podStartSLOduration=3.275197575 podStartE2EDuration="3.275197575s" podCreationTimestamp="2025-10-07 15:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:10:41.266959169 +0000 UTC m=+7793.427681856" watchObservedRunningTime="2025-10-07 15:10:41.275197575 +0000 UTC m=+7793.435920252" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.755014 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6lg56"] Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.757782 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.767854 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lg56"] Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.872614 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-catalog-content\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.873042 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkqz2\" (UniqueName: \"kubernetes.io/projected/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-kube-api-access-nkqz2\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.873376 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-utilities\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.976774 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-utilities\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.976349 4959 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-utilities\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.977362 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-catalog-content\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.977487 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkqz2\" (UniqueName: \"kubernetes.io/projected/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-kube-api-access-nkqz2\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:44 crc kubenswrapper[4959]: I1007 15:10:44.978141 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-catalog-content\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:45 crc kubenswrapper[4959]: I1007 15:10:45.005081 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkqz2\" (UniqueName: \"kubernetes.io/projected/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-kube-api-access-nkqz2\") pod \"redhat-marketplace-6lg56\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:45 crc kubenswrapper[4959]: I1007 15:10:45.090129 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:46 crc kubenswrapper[4959]: I1007 15:10:46.193341 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lg56"] Oct 07 15:10:46 crc kubenswrapper[4959]: I1007 15:10:46.286501 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lg56" event={"ID":"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c","Type":"ContainerStarted","Data":"94d5f3af6ba9707adf850531cc2bf47a3e304a068801360af6ccf40a72bba51b"} Oct 07 15:10:47 crc kubenswrapper[4959]: I1007 15:10:47.301174 4959 generic.go:334] "Generic (PLEG): container finished" podID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerID="129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290" exitCode=0 Oct 07 15:10:47 crc kubenswrapper[4959]: I1007 15:10:47.301612 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lg56" event={"ID":"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c","Type":"ContainerDied","Data":"129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290"} Oct 07 15:10:49 crc kubenswrapper[4959]: I1007 15:10:49.334280 4959 generic.go:334] "Generic (PLEG): container finished" podID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerID="90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9" exitCode=0 Oct 07 15:10:49 crc kubenswrapper[4959]: I1007 15:10:49.334398 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lg56" event={"ID":"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c","Type":"ContainerDied","Data":"90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9"} Oct 07 15:10:51 crc kubenswrapper[4959]: I1007 15:10:51.356317 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lg56" 
event={"ID":"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c","Type":"ContainerStarted","Data":"1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1"} Oct 07 15:10:51 crc kubenswrapper[4959]: I1007 15:10:51.414274 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6lg56" podStartSLOduration=3.914827846 podStartE2EDuration="7.414252315s" podCreationTimestamp="2025-10-07 15:10:44 +0000 UTC" firstStartedPulling="2025-10-07 15:10:47.304494333 +0000 UTC m=+7799.465217020" lastFinishedPulling="2025-10-07 15:10:50.803918812 +0000 UTC m=+7802.964641489" observedRunningTime="2025-10-07 15:10:51.374101483 +0000 UTC m=+7803.534824160" watchObservedRunningTime="2025-10-07 15:10:51.414252315 +0000 UTC m=+7803.574974992" Oct 07 15:10:55 crc kubenswrapper[4959]: I1007 15:10:55.091262 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:55 crc kubenswrapper[4959]: I1007 15:10:55.092882 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:10:55 crc kubenswrapper[4959]: I1007 15:10:55.137553 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:11:05 crc kubenswrapper[4959]: I1007 15:11:05.135899 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:11:05 crc kubenswrapper[4959]: I1007 15:11:05.197536 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lg56"] Oct 07 15:11:05 crc kubenswrapper[4959]: I1007 15:11:05.487504 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6lg56" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="registry-server" 
containerID="cri-o://1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1" gracePeriod=2 Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.077867 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.244824 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkqz2\" (UniqueName: \"kubernetes.io/projected/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-kube-api-access-nkqz2\") pod \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.245121 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-catalog-content\") pod \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.245165 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-utilities\") pod \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\" (UID: \"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c\") " Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.246211 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-utilities" (OuterVolumeSpecName: "utilities") pod "a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" (UID: "a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.251311 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-kube-api-access-nkqz2" (OuterVolumeSpecName: "kube-api-access-nkqz2") pod "a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" (UID: "a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c"). InnerVolumeSpecName "kube-api-access-nkqz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.259039 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" (UID: "a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.348393 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.348442 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.348457 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkqz2\" (UniqueName: \"kubernetes.io/projected/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c-kube-api-access-nkqz2\") on node \"crc\" DevicePath \"\"" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.508848 4959 generic.go:334] "Generic (PLEG): container finished" podID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" 
containerID="1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1" exitCode=0 Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.508896 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lg56" event={"ID":"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c","Type":"ContainerDied","Data":"1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1"} Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.508922 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lg56" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.508936 4959 scope.go:117] "RemoveContainer" containerID="1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.508926 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lg56" event={"ID":"a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c","Type":"ContainerDied","Data":"94d5f3af6ba9707adf850531cc2bf47a3e304a068801360af6ccf40a72bba51b"} Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.540349 4959 scope.go:117] "RemoveContainer" containerID="90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.546071 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lg56"] Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.554375 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lg56"] Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.560522 4959 scope.go:117] "RemoveContainer" containerID="129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.602442 4959 scope.go:117] "RemoveContainer" containerID="1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1" Oct 07 
15:11:06 crc kubenswrapper[4959]: E1007 15:11:06.602831 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1\": container with ID starting with 1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1 not found: ID does not exist" containerID="1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.603346 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1"} err="failed to get container status \"1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1\": rpc error: code = NotFound desc = could not find container \"1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1\": container with ID starting with 1300dec9873a6d3aeb64a9da0e7480280328bc37554d2dbafe8b52af8a8333c1 not found: ID does not exist" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.603460 4959 scope.go:117] "RemoveContainer" containerID="90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9" Oct 07 15:11:06 crc kubenswrapper[4959]: E1007 15:11:06.603911 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9\": container with ID starting with 90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9 not found: ID does not exist" containerID="90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.603962 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9"} err="failed to get container status 
\"90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9\": rpc error: code = NotFound desc = could not find container \"90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9\": container with ID starting with 90765133d5737114939386d855180ee3db113166e08179a882fab509bda753e9 not found: ID does not exist" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.604008 4959 scope.go:117] "RemoveContainer" containerID="129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290" Oct 07 15:11:06 crc kubenswrapper[4959]: E1007 15:11:06.604265 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290\": container with ID starting with 129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290 not found: ID does not exist" containerID="129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.604393 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290"} err="failed to get container status \"129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290\": rpc error: code = NotFound desc = could not find container \"129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290\": container with ID starting with 129fc2b6eb6a16bf69522e3393f64f715757b4ea244d01e45a07dea41bf76290 not found: ID does not exist" Oct 07 15:11:06 crc kubenswrapper[4959]: I1007 15:11:06.820089 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" path="/var/lib/kubelet/pods/a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c/volumes" Oct 07 15:11:07 crc kubenswrapper[4959]: I1007 15:11:07.696218 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:11:07 crc kubenswrapper[4959]: I1007 15:11:07.696289 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:11:37 crc kubenswrapper[4959]: I1007 15:11:37.695752 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:11:37 crc kubenswrapper[4959]: I1007 15:11:37.696430 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:11:37 crc kubenswrapper[4959]: I1007 15:11:37.696500 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 15:11:37 crc kubenswrapper[4959]: I1007 15:11:37.697426 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3ed63282df901adf55f77883df90781c0e6935c29dea4d25bec61214523c3c6"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:11:37 crc 
kubenswrapper[4959]: I1007 15:11:37.697485 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://a3ed63282df901adf55f77883df90781c0e6935c29dea4d25bec61214523c3c6" gracePeriod=600 Oct 07 15:11:38 crc kubenswrapper[4959]: I1007 15:11:38.795686 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="a3ed63282df901adf55f77883df90781c0e6935c29dea4d25bec61214523c3c6" exitCode=0 Oct 07 15:11:38 crc kubenswrapper[4959]: I1007 15:11:38.795790 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"a3ed63282df901adf55f77883df90781c0e6935c29dea4d25bec61214523c3c6"} Oct 07 15:11:38 crc kubenswrapper[4959]: I1007 15:11:38.796210 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"} Oct 07 15:11:38 crc kubenswrapper[4959]: I1007 15:11:38.796240 4959 scope.go:117] "RemoveContainer" containerID="367b932149357aed55a7d74f497bb2c074fd42c10792a51fc102bd7fc4946348" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.094610 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lbqfz"] Oct 07 15:11:55 crc kubenswrapper[4959]: E1007 15:11:55.095706 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="registry-server" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.095735 4959 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="registry-server" Oct 07 15:11:55 crc kubenswrapper[4959]: E1007 15:11:55.095754 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="extract-utilities" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.095761 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="extract-utilities" Oct 07 15:11:55 crc kubenswrapper[4959]: E1007 15:11:55.095774 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="extract-content" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.095780 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="extract-content" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.096059 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d3638c-2b7f-43c1-bf1c-c5bd02838b5c" containerName="registry-server" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.097578 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.131095 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbqfz"] Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.174039 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-catalog-content\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.174174 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c68x\" (UniqueName: \"kubernetes.io/projected/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-kube-api-access-8c68x\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.174290 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-utilities\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.275650 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-catalog-content\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.275763 4959 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8c68x\" (UniqueName: \"kubernetes.io/projected/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-kube-api-access-8c68x\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.275895 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-utilities\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.276214 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-catalog-content\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.276281 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-utilities\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.299012 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c68x\" (UniqueName: \"kubernetes.io/projected/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-kube-api-access-8c68x\") pod \"redhat-operators-lbqfz\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.419379 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.941787 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbqfz"] Oct 07 15:11:55 crc kubenswrapper[4959]: I1007 15:11:55.953272 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerStarted","Data":"54f786917d26ff2520a01136909c376c4c058e25c5e1524a0560a54df6a7c27e"} Oct 07 15:11:56 crc kubenswrapper[4959]: I1007 15:11:56.962016 4959 generic.go:334] "Generic (PLEG): container finished" podID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerID="8b48874f82a3101ab2f00562ab57f98fb9953361d8e7ad2a4e5881e1fffb7211" exitCode=0 Oct 07 15:11:56 crc kubenswrapper[4959]: I1007 15:11:56.962198 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerDied","Data":"8b48874f82a3101ab2f00562ab57f98fb9953361d8e7ad2a4e5881e1fffb7211"} Oct 07 15:11:58 crc kubenswrapper[4959]: I1007 15:11:58.993544 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerStarted","Data":"aa9951be147a16edab982de8de244a401e61e52a0c3ff5b2cd112cec1d555953"} Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.873142 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m7l2k"] Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.875755 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.881541 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-utilities\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.882034 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpdnz\" (UniqueName: \"kubernetes.io/projected/22125028-f963-4ed8-9fd7-6839719977c8-kube-api-access-qpdnz\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.882408 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-catalog-content\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.890260 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7l2k"] Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.985147 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-utilities\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.985293 4959 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qpdnz\" (UniqueName: \"kubernetes.io/projected/22125028-f963-4ed8-9fd7-6839719977c8-kube-api-access-qpdnz\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.985365 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-catalog-content\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.986194 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-catalog-content\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:11:59 crc kubenswrapper[4959]: I1007 15:11:59.986306 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-utilities\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:00 crc kubenswrapper[4959]: I1007 15:12:00.017096 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpdnz\" (UniqueName: \"kubernetes.io/projected/22125028-f963-4ed8-9fd7-6839719977c8-kube-api-access-qpdnz\") pod \"certified-operators-m7l2k\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:00 crc kubenswrapper[4959]: I1007 15:12:00.200549 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:00 crc kubenswrapper[4959]: I1007 15:12:00.752282 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7l2k"] Oct 07 15:12:01 crc kubenswrapper[4959]: I1007 15:12:01.024271 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerStarted","Data":"900c09fdc69ab8d22de3d39ca93129faac1b4e9072a53dbab6ae35cc53efbb46"} Oct 07 15:12:02 crc kubenswrapper[4959]: I1007 15:12:02.038379 4959 generic.go:334] "Generic (PLEG): container finished" podID="22125028-f963-4ed8-9fd7-6839719977c8" containerID="c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81" exitCode=0 Oct 07 15:12:02 crc kubenswrapper[4959]: I1007 15:12:02.038517 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerDied","Data":"c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81"} Oct 07 15:12:03 crc kubenswrapper[4959]: I1007 15:12:03.051019 4959 generic.go:334] "Generic (PLEG): container finished" podID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerID="aa9951be147a16edab982de8de244a401e61e52a0c3ff5b2cd112cec1d555953" exitCode=0 Oct 07 15:12:03 crc kubenswrapper[4959]: I1007 15:12:03.051196 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerDied","Data":"aa9951be147a16edab982de8de244a401e61e52a0c3ff5b2cd112cec1d555953"} Oct 07 15:12:04 crc kubenswrapper[4959]: I1007 15:12:04.061412 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" 
event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerStarted","Data":"62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4"} Oct 07 15:12:05 crc kubenswrapper[4959]: I1007 15:12:05.076566 4959 generic.go:334] "Generic (PLEG): container finished" podID="22125028-f963-4ed8-9fd7-6839719977c8" containerID="62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4" exitCode=0 Oct 07 15:12:05 crc kubenswrapper[4959]: I1007 15:12:05.076790 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerDied","Data":"62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4"} Oct 07 15:12:06 crc kubenswrapper[4959]: I1007 15:12:06.088186 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerStarted","Data":"8251cd1b5d3f7bf2cf8e85ac95d29b474da2e9801bb27513b5585704dc25de35"} Oct 07 15:12:06 crc kubenswrapper[4959]: I1007 15:12:06.093705 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerStarted","Data":"d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939"} Oct 07 15:12:06 crc kubenswrapper[4959]: I1007 15:12:06.113883 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lbqfz" podStartSLOduration=3.086208019 podStartE2EDuration="11.113743378s" podCreationTimestamp="2025-10-07 15:11:55 +0000 UTC" firstStartedPulling="2025-10-07 15:11:56.964834301 +0000 UTC m=+7869.125556968" lastFinishedPulling="2025-10-07 15:12:04.99236965 +0000 UTC m=+7877.153092327" observedRunningTime="2025-10-07 15:12:06.109207258 +0000 UTC m=+7878.269929935" watchObservedRunningTime="2025-10-07 15:12:06.113743378 +0000 UTC 
m=+7878.274466055" Oct 07 15:12:06 crc kubenswrapper[4959]: I1007 15:12:06.160563 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m7l2k" podStartSLOduration=3.619621431 podStartE2EDuration="7.160545121s" podCreationTimestamp="2025-10-07 15:11:59 +0000 UTC" firstStartedPulling="2025-10-07 15:12:02.042546522 +0000 UTC m=+7874.203269199" lastFinishedPulling="2025-10-07 15:12:05.583470212 +0000 UTC m=+7877.744192889" observedRunningTime="2025-10-07 15:12:06.133609088 +0000 UTC m=+7878.294331765" watchObservedRunningTime="2025-10-07 15:12:06.160545121 +0000 UTC m=+7878.321267798" Oct 07 15:12:10 crc kubenswrapper[4959]: I1007 15:12:10.201865 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:10 crc kubenswrapper[4959]: I1007 15:12:10.203436 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:10 crc kubenswrapper[4959]: I1007 15:12:10.308611 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:11 crc kubenswrapper[4959]: I1007 15:12:11.195240 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:11 crc kubenswrapper[4959]: I1007 15:12:11.268331 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7l2k"] Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.156177 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m7l2k" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="registry-server" containerID="cri-o://d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939" gracePeriod=2 Oct 07 15:12:13 crc kubenswrapper[4959]: 
I1007 15:12:13.664948 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.728288 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-utilities\") pod \"22125028-f963-4ed8-9fd7-6839719977c8\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.728367 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-catalog-content\") pod \"22125028-f963-4ed8-9fd7-6839719977c8\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.728656 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpdnz\" (UniqueName: \"kubernetes.io/projected/22125028-f963-4ed8-9fd7-6839719977c8-kube-api-access-qpdnz\") pod \"22125028-f963-4ed8-9fd7-6839719977c8\" (UID: \"22125028-f963-4ed8-9fd7-6839719977c8\") " Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.729740 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-utilities" (OuterVolumeSpecName: "utilities") pod "22125028-f963-4ed8-9fd7-6839719977c8" (UID: "22125028-f963-4ed8-9fd7-6839719977c8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.738054 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22125028-f963-4ed8-9fd7-6839719977c8-kube-api-access-qpdnz" (OuterVolumeSpecName: "kube-api-access-qpdnz") pod "22125028-f963-4ed8-9fd7-6839719977c8" (UID: "22125028-f963-4ed8-9fd7-6839719977c8"). InnerVolumeSpecName "kube-api-access-qpdnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.783394 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22125028-f963-4ed8-9fd7-6839719977c8" (UID: "22125028-f963-4ed8-9fd7-6839719977c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.833368 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpdnz\" (UniqueName: \"kubernetes.io/projected/22125028-f963-4ed8-9fd7-6839719977c8-kube-api-access-qpdnz\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.833405 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:13 crc kubenswrapper[4959]: I1007 15:12:13.833415 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22125028-f963-4ed8-9fd7-6839719977c8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.169126 4959 generic.go:334] "Generic (PLEG): container finished" podID="22125028-f963-4ed8-9fd7-6839719977c8" 
containerID="d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939" exitCode=0 Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.169181 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerDied","Data":"d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939"} Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.169188 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7l2k" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.169215 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7l2k" event={"ID":"22125028-f963-4ed8-9fd7-6839719977c8","Type":"ContainerDied","Data":"900c09fdc69ab8d22de3d39ca93129faac1b4e9072a53dbab6ae35cc53efbb46"} Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.169236 4959 scope.go:117] "RemoveContainer" containerID="d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.196732 4959 scope.go:117] "RemoveContainer" containerID="62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.225185 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7l2k"] Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.235337 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m7l2k"] Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.255956 4959 scope.go:117] "RemoveContainer" containerID="c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.290316 4959 scope.go:117] "RemoveContainer" containerID="d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939" Oct 07 
15:12:14 crc kubenswrapper[4959]: E1007 15:12:14.292326 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939\": container with ID starting with d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939 not found: ID does not exist" containerID="d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.292390 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939"} err="failed to get container status \"d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939\": rpc error: code = NotFound desc = could not find container \"d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939\": container with ID starting with d5f28820204090369dcf9ece8d60230dc0103d43949de1711453218c3afeb939 not found: ID does not exist" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.292419 4959 scope.go:117] "RemoveContainer" containerID="62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4" Oct 07 15:12:14 crc kubenswrapper[4959]: E1007 15:12:14.292738 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4\": container with ID starting with 62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4 not found: ID does not exist" containerID="62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.292770 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4"} err="failed to get container status 
\"62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4\": rpc error: code = NotFound desc = could not find container \"62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4\": container with ID starting with 62b260aa246ed958195899b852ea8a8d6ecacc10e4fbdeb9d28461facc50dcf4 not found: ID does not exist" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.292793 4959 scope.go:117] "RemoveContainer" containerID="c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81" Oct 07 15:12:14 crc kubenswrapper[4959]: E1007 15:12:14.294113 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81\": container with ID starting with c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81 not found: ID does not exist" containerID="c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.294145 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81"} err="failed to get container status \"c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81\": rpc error: code = NotFound desc = could not find container \"c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81\": container with ID starting with c86b10f18c6e4bd2cf3d87219b92ab03330230972c5b6bd116116b5a28a2db81 not found: ID does not exist" Oct 07 15:12:14 crc kubenswrapper[4959]: I1007 15:12:14.823359 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22125028-f963-4ed8-9fd7-6839719977c8" path="/var/lib/kubelet/pods/22125028-f963-4ed8-9fd7-6839719977c8/volumes" Oct 07 15:12:15 crc kubenswrapper[4959]: I1007 15:12:15.420116 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 
07 15:12:15 crc kubenswrapper[4959]: I1007 15:12:15.420174 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:12:16 crc kubenswrapper[4959]: I1007 15:12:16.465170 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lbqfz" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="registry-server" probeResult="failure" output=< Oct 07 15:12:16 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 15:12:16 crc kubenswrapper[4959]: > Oct 07 15:12:25 crc kubenswrapper[4959]: I1007 15:12:25.275270 4959 generic.go:334] "Generic (PLEG): container finished" podID="6d961e86-0037-4c2a-ac1f-b73c10339406" containerID="09d0e0a287aea088809452e4fb8642b797e4ebc6c3b86839b294571d24ce1c39" exitCode=0 Oct 07 15:12:25 crc kubenswrapper[4959]: I1007 15:12:25.275355 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"6d961e86-0037-4c2a-ac1f-b73c10339406","Type":"ContainerDied","Data":"09d0e0a287aea088809452e4fb8642b797e4ebc6c3b86839b294571d24ce1c39"} Oct 07 15:12:25 crc kubenswrapper[4959]: I1007 15:12:25.471952 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:12:25 crc kubenswrapper[4959]: I1007 15:12:25.519172 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:12:25 crc kubenswrapper[4959]: I1007 15:12:25.706220 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbqfz"] Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.792976 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.941900 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943201 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-openstack-config-secret\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943257 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ceph\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943353 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-clouds-config\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943461 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-kubeconfig\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943501 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-temporary\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943530 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sj4g\" (UniqueName: \"kubernetes.io/projected/6d961e86-0037-4c2a-ac1f-b73c10339406-kube-api-access-9sj4g\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943578 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-workdir\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943609 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-private-key\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943657 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-config\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943692 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-public-key\") pod 
\"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.943719 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ca-certs\") pod \"6d961e86-0037-4c2a-ac1f-b73c10339406\" (UID: \"6d961e86-0037-4c2a-ac1f-b73c10339406\") " Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.945458 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.948995 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d961e86-0037-4c2a-ac1f-b73c10339406-kube-api-access-9sj4g" (OuterVolumeSpecName: "kube-api-access-9sj4g") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "kube-api-access-9sj4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.949453 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.951061 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ceph" (OuterVolumeSpecName: "ceph") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.969790 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.972982 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "tobiko-private-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.976000 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.978465 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:12:26 crc kubenswrapper[4959]: I1007 15:12:26.982761 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.001128 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.006899 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046308 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046348 4959 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-kubeconfig\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046360 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046371 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sj4g\" (UniqueName: \"kubernetes.io/projected/6d961e86-0037-4c2a-ac1f-b73c10339406-kube-api-access-9sj4g\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046381 4959 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046390 4959 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046398 4959 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/6d961e86-0037-4c2a-ac1f-b73c10339406-tobiko-public-key\") on node \"crc\" 
DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046407 4959 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046439 4959 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046448 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.046457 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d961e86-0037-4c2a-ac1f-b73c10339406-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.070582 4959 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.148657 4959 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.298542 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"6d961e86-0037-4c2a-ac1f-b73c10339406","Type":"ContainerDied","Data":"ff1b5a733fa099719b49332a0f252ac51c417d5a02069dcf0f74df6fcd297f44"} Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.298585 4959 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ff1b5a733fa099719b49332a0f252ac51c417d5a02069dcf0f74df6fcd297f44" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.298614 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 07 15:12:27 crc kubenswrapper[4959]: I1007 15:12:27.298743 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lbqfz" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="registry-server" containerID="cri-o://8251cd1b5d3f7bf2cf8e85ac95d29b474da2e9801bb27513b5585704dc25de35" gracePeriod=2 Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.314524 4959 generic.go:334] "Generic (PLEG): container finished" podID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerID="8251cd1b5d3f7bf2cf8e85ac95d29b474da2e9801bb27513b5585704dc25de35" exitCode=0 Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.315256 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerDied","Data":"8251cd1b5d3f7bf2cf8e85ac95d29b474da2e9801bb27513b5585704dc25de35"} Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.315294 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbqfz" event={"ID":"5ccd0bf6-2b13-4ccc-a594-a614ce508b27","Type":"ContainerDied","Data":"54f786917d26ff2520a01136909c376c4c058e25c5e1524a0560a54df6a7c27e"} Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.315309 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f786917d26ff2520a01136909c376c4c058e25c5e1524a0560a54df6a7c27e" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.387820 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.481992 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-catalog-content\") pod \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.482307 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c68x\" (UniqueName: \"kubernetes.io/projected/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-kube-api-access-8c68x\") pod \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.483354 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-utilities\") pod \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\" (UID: \"5ccd0bf6-2b13-4ccc-a594-a614ce508b27\") " Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.485006 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-utilities" (OuterVolumeSpecName: "utilities") pod "5ccd0bf6-2b13-4ccc-a594-a614ce508b27" (UID: "5ccd0bf6-2b13-4ccc-a594-a614ce508b27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.497023 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-kube-api-access-8c68x" (OuterVolumeSpecName: "kube-api-access-8c68x") pod "5ccd0bf6-2b13-4ccc-a594-a614ce508b27" (UID: "5ccd0bf6-2b13-4ccc-a594-a614ce508b27"). InnerVolumeSpecName "kube-api-access-8c68x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.533734 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6d961e86-0037-4c2a-ac1f-b73c10339406" (UID: "6d961e86-0037-4c2a-ac1f-b73c10339406"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.585757 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c68x\" (UniqueName: \"kubernetes.io/projected/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-kube-api-access-8c68x\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.585793 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.585804 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6d961e86-0037-4c2a-ac1f-b73c10339406-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.589715 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ccd0bf6-2b13-4ccc-a594-a614ce508b27" (UID: "5ccd0bf6-2b13-4ccc-a594-a614ce508b27"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:12:28 crc kubenswrapper[4959]: I1007 15:12:28.686647 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd0bf6-2b13-4ccc-a594-a614ce508b27-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:29 crc kubenswrapper[4959]: I1007 15:12:29.323986 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lbqfz" Oct 07 15:12:29 crc kubenswrapper[4959]: I1007 15:12:29.346083 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbqfz"] Oct 07 15:12:29 crc kubenswrapper[4959]: I1007 15:12:29.354268 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lbqfz"] Oct 07 15:12:30 crc kubenswrapper[4959]: I1007 15:12:30.819383 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" path="/var/lib/kubelet/pods/5ccd0bf6-2b13-4ccc-a594-a614ce508b27/volumes" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.513249 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514199 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="extract-content" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514219 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="extract-content" Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514250 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="registry-server" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514260 4959 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="registry-server" Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514278 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="extract-utilities" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514286 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="extract-utilities" Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514309 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="extract-utilities" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514318 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="extract-utilities" Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514342 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="registry-server" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514351 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="registry-server" Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514372 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="extract-content" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514381 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="extract-content" Oct 07 15:12:32 crc kubenswrapper[4959]: E1007 15:12:32.514399 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d961e86-0037-4c2a-ac1f-b73c10339406" containerName="tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514407 4959 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6d961e86-0037-4c2a-ac1f-b73c10339406" containerName="tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514708 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d961e86-0037-4c2a-ac1f-b73c10339406" containerName="tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514724 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ccd0bf6-2b13-4ccc-a594-a614ce508b27" containerName="registry-server" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.514751 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="22125028-f963-4ed8-9fd7-6839719977c8" containerName="registry-server" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.515726 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.531151 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.665615 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txnwd\" (UniqueName: \"kubernetes.io/projected/a85e3cec-d699-4a9f-9da3-809799b06f1c-kube-api-access-txnwd\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.665811 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 
15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.768385 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txnwd\" (UniqueName: \"kubernetes.io/projected/a85e3cec-d699-4a9f-9da3-809799b06f1c-kube-api-access-txnwd\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.768485 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.768824 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.792897 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txnwd\" (UniqueName: \"kubernetes.io/projected/a85e3cec-d699-4a9f-9da3-809799b06f1c-kube-api-access-txnwd\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.802324 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a85e3cec-d699-4a9f-9da3-809799b06f1c\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:32 crc kubenswrapper[4959]: I1007 15:12:32.845403 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 07 15:12:33 crc kubenswrapper[4959]: I1007 15:12:33.309151 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Oct 07 15:12:33 crc kubenswrapper[4959]: I1007 15:12:33.313051 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:12:33 crc kubenswrapper[4959]: I1007 15:12:33.358699 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"a85e3cec-d699-4a9f-9da3-809799b06f1c","Type":"ContainerStarted","Data":"1b5a28282cb4e7229b5d617b94e19a5551cc5164523cd3d9385f2dda3f5ab267"} Oct 07 15:12:36 crc kubenswrapper[4959]: I1007 15:12:36.395068 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"a85e3cec-d699-4a9f-9da3-809799b06f1c","Type":"ContainerStarted","Data":"13021a215c760550c3ef1e76108468b7e862e334f9e450095ea45e86d1c32f74"} Oct 07 15:12:36 crc kubenswrapper[4959]: I1007 15:12:36.412892 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" podStartSLOduration=2.659765753 podStartE2EDuration="4.41287237s" podCreationTimestamp="2025-10-07 15:12:32 +0000 UTC" firstStartedPulling="2025-10-07 15:12:33.312857552 +0000 UTC m=+7905.473580229" lastFinishedPulling="2025-10-07 15:12:35.065964129 +0000 UTC m=+7907.226686846" observedRunningTime="2025-10-07 15:12:36.409925575 +0000 UTC m=+7908.570648262" watchObservedRunningTime="2025-10-07 
15:12:36.41287237 +0000 UTC m=+7908.573595047" Oct 07 15:12:52 crc kubenswrapper[4959]: I1007 15:12:52.951362 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ansibletest-ansibletest"] Oct 07 15:12:52 crc kubenswrapper[4959]: I1007 15:12:52.953421 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 07 15:12:52 crc kubenswrapper[4959]: I1007 15:12:52.956606 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 15:12:52 crc kubenswrapper[4959]: I1007 15:12:52.956736 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 07 15:12:52 crc kubenswrapper[4959]: I1007 15:12:52.963408 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.066799 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.067051 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7c8\" (UniqueName: \"kubernetes.io/projected/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-kube-api-access-kg7c8\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.067322 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.067392 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.067437 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.067744 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.067943 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.068017 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ceph\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.068186 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.068407 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172026 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172514 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172586 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7c8\" (UniqueName: 
\"kubernetes.io/projected/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-kube-api-access-kg7c8\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172670 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172704 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172723 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172796 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172852 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172873 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ceph\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.172920 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.173537 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.173999 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.174034 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-workdir\") 
pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.174811 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.181395 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.181530 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.181529 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ceph\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.183837 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 
crc kubenswrapper[4959]: I1007 15:12:53.187379 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.190240 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7c8\" (UniqueName: \"kubernetes.io/projected/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-kube-api-access-kg7c8\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.208117 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.294962 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 07 15:12:53 crc kubenswrapper[4959]: I1007 15:12:53.792151 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Oct 07 15:12:54 crc kubenswrapper[4959]: I1007 15:12:54.597641 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"9a227eb5-2c22-41c7-a0d8-a35d821c46e6","Type":"ContainerStarted","Data":"add62b8a4e1957f89c2a1bded99889d57b66d95664a0dd0d1a0a31c31bb3abb8"} Oct 07 15:13:09 crc kubenswrapper[4959]: E1007 15:13:09.926850 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified" Oct 07 15:13:09 crc kubenswrapper[4959]: E1007 15:13:09.927971 4959 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 07 15:13:09 crc kubenswrapper[4959]: container &Container{Name:ansibletest-ansibletest,Image:quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_ANSIBLE_EXTRA_VARS,Value:-e manual_run=false,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_FILE_EXTRA_VARS,Value:--- Oct 07 15:13:09 crc kubenswrapper[4959]: foo: bar Oct 07 15:13:09 crc kubenswrapper[4959]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_BRANCH,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_REPO,Value:https://github.com/ansible/test-playbooks,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_INVENTORY,Value:localhost ansible_connection=local ansible_python_interpreter=python3 Oct 07 15:13:09 crc kubenswrapper[4959]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_PLAYBOOK,Value:./debug.yml,ValueFrom:nil,},EnvVar{Name:POD_DEBUG,Value:false,ValueFrom:nil,},EnvVar{Name:POD_INSTALL_COLLECTIONS,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: 
{{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/ansible,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/AnsibleTests/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/ansible/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/var/lib/ansible/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:workload-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/test_keypair.key,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:compute-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/.ssh/compute_id,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,Recu
rsiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kg7c8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*227,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*227,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ansibletest-ansibletest_openstack(9a227eb5-2c22-41c7-a0d8-a35d821c46e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Oct 07 15:13:09 crc kubenswrapper[4959]: > logger="UnhandledError" Oct 07 15:13:09 crc kubenswrapper[4959]: E1007 15:13:09.929202 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ansibletest-ansibletest" podUID="9a227eb5-2c22-41c7-a0d8-a35d821c46e6" Oct 07 15:13:10 crc kubenswrapper[4959]: E1007 15:13:10.804020 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified\\\"\"" pod="openstack/ansibletest-ansibletest" podUID="9a227eb5-2c22-41c7-a0d8-a35d821c46e6" Oct 07 15:13:26 crc kubenswrapper[4959]: I1007 15:13:26.981154 4959 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"9a227eb5-2c22-41c7-a0d8-a35d821c46e6","Type":"ContainerStarted","Data":"fbb1500c3e042558170f3c0dbf4e62318fef90951072343ff980e80c4355f295"} Oct 07 15:13:27 crc kubenswrapper[4959]: I1007 15:13:27.013646 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ansibletest-ansibletest" podStartSLOduration=4.512936565 podStartE2EDuration="36.013615376s" podCreationTimestamp="2025-10-07 15:12:51 +0000 UTC" firstStartedPulling="2025-10-07 15:12:53.783725403 +0000 UTC m=+7925.944448080" lastFinishedPulling="2025-10-07 15:13:25.284404214 +0000 UTC m=+7957.445126891" observedRunningTime="2025-10-07 15:13:27.004946367 +0000 UTC m=+7959.165669044" watchObservedRunningTime="2025-10-07 15:13:27.013615376 +0000 UTC m=+7959.174338053" Oct 07 15:13:27 crc kubenswrapper[4959]: I1007 15:13:27.995669 4959 generic.go:334] "Generic (PLEG): container finished" podID="9a227eb5-2c22-41c7-a0d8-a35d821c46e6" containerID="fbb1500c3e042558170f3c0dbf4e62318fef90951072343ff980e80c4355f295" exitCode=0 Oct 07 15:13:27 crc kubenswrapper[4959]: I1007 15:13:27.995918 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"9a227eb5-2c22-41c7-a0d8-a35d821c46e6","Type":"ContainerDied","Data":"fbb1500c3e042558170f3c0dbf4e62318fef90951072343ff980e80c4355f295"} Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.516591 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673027 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ceph\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673105 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-temporary\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673178 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config-secret\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673209 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg7c8\" (UniqueName: \"kubernetes.io/projected/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-kube-api-access-kg7c8\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673231 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-compute-ssh-secret\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673251 4959 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-workdir\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673312 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673425 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ca-certs\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673549 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-workload-ssh-secret\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.673697 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\" (UID: \"9a227eb5-2c22-41c7-a0d8-a35d821c46e6\") " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.674594 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.682800 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-kube-api-access-kg7c8" (OuterVolumeSpecName: "kube-api-access-kg7c8") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "kube-api-access-kg7c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.682839 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.684621 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ceph" (OuterVolumeSpecName: "ceph") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.691490 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.713810 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-workload-ssh-secret" (OuterVolumeSpecName: "workload-ssh-secret") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "workload-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.715060 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.719286 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-compute-ssh-secret" (OuterVolumeSpecName: "compute-ssh-secret") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "compute-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.749264 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.758953 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9a227eb5-2c22-41c7-a0d8-a35d821c46e6" (UID: "9a227eb5-2c22-41c7-a0d8-a35d821c46e6"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777030 4959 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777090 4959 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777105 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777125 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777139 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg7c8\" (UniqueName: \"kubernetes.io/projected/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-kube-api-access-kg7c8\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777153 4959 reconciler_common.go:293] "Volume detached for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777169 4959 reconciler_common.go:293] "Volume detached for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-compute-ssh-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777183 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777197 4959 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.777209 4959 reconciler_common.go:293] "Volume detached for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/9a227eb5-2c22-41c7-a0d8-a35d821c46e6-workload-ssh-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.801901 4959 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 07 15:13:29 crc kubenswrapper[4959]: I1007 15:13:29.879670 4959 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:30 crc kubenswrapper[4959]: I1007 15:13:30.020105 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" 
event={"ID":"9a227eb5-2c22-41c7-a0d8-a35d821c46e6","Type":"ContainerDied","Data":"add62b8a4e1957f89c2a1bded99889d57b66d95664a0dd0d1a0a31c31bb3abb8"} Oct 07 15:13:30 crc kubenswrapper[4959]: I1007 15:13:30.020153 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add62b8a4e1957f89c2a1bded99889d57b66d95664a0dd0d1a0a31c31bb3abb8" Oct 07 15:13:30 crc kubenswrapper[4959]: I1007 15:13:30.020220 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 07 15:13:37 crc kubenswrapper[4959]: I1007 15:13:37.891077 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Oct 07 15:13:37 crc kubenswrapper[4959]: E1007 15:13:37.892390 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a227eb5-2c22-41c7-a0d8-a35d821c46e6" containerName="ansibletest-ansibletest" Oct 07 15:13:37 crc kubenswrapper[4959]: I1007 15:13:37.892410 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a227eb5-2c22-41c7-a0d8-a35d821c46e6" containerName="ansibletest-ansibletest" Oct 07 15:13:37 crc kubenswrapper[4959]: I1007 15:13:37.892704 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a227eb5-2c22-41c7-a0d8-a35d821c46e6" containerName="ansibletest-ansibletest" Oct 07 15:13:37 crc kubenswrapper[4959]: I1007 15:13:37.896214 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:37 crc kubenswrapper[4959]: I1007 15:13:37.931418 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.073074 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.074438 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs8xg\" (UniqueName: \"kubernetes.io/projected/4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825-kube-api-access-zs8xg\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.177509 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8xg\" (UniqueName: \"kubernetes.io/projected/4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825-kube-api-access-zs8xg\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.177605 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: 
\"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.178304 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.208573 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs8xg\" (UniqueName: \"kubernetes.io/projected/4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825-kube-api-access-zs8xg\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.213648 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.274321 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"
Oct 07 15:13:38 crc kubenswrapper[4959]: I1007 15:13:38.746117 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"]
Oct 07 15:13:39 crc kubenswrapper[4959]: I1007 15:13:39.114360 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825","Type":"ContainerStarted","Data":"88a4d601338b5ccbc46e8801597531da7cab4d4c82ff12ce4b3fa635005c9a6c"}
Oct 07 15:13:40 crc kubenswrapper[4959]: I1007 15:13:40.128052 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825","Type":"ContainerStarted","Data":"58a49ea892805ba1996d794de326380b7f11d4c642bcfe58cd07455e47a0a7a0"}
Oct 07 15:13:40 crc kubenswrapper[4959]: I1007 15:13:40.146034 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" podStartSLOduration=2.5291085410000003 podStartE2EDuration="3.146014762s" podCreationTimestamp="2025-10-07 15:13:37 +0000 UTC" firstStartedPulling="2025-10-07 15:13:38.751535777 +0000 UTC m=+7970.912258454" lastFinishedPulling="2025-10-07 15:13:39.368441958 +0000 UTC m=+7971.529164675" observedRunningTime="2025-10-07 15:13:40.144124238 +0000 UTC m=+7972.304846915" watchObservedRunningTime="2025-10-07 15:13:40.146014762 +0000 UTC m=+7972.306737439"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.513908 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizontest-tests-horizontest"]
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.516385 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.518477 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.520052 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizontest-tests-horizontesthorizontest-config"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.523313 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"]
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614133 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614207 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614250 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614279 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614300 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614322 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614617 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.614892 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrvpl\" (UniqueName: \"kubernetes.io/projected/f07fca81-ce0b-4795-94ce-f4430d953e7a-kube-api-access-rrvpl\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717401 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrvpl\" (UniqueName: \"kubernetes.io/projected/f07fca81-ce0b-4795-94ce-f4430d953e7a-kube-api-access-rrvpl\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717476 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717519 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717562 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717596 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717655 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717685 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.717764 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.718384 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.718455 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.718596 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.718872 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.724197 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.724452 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.725299 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.736964 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrvpl\" (UniqueName: \"kubernetes.io/projected/f07fca81-ce0b-4795-94ce-f4430d953e7a-kube-api-access-rrvpl\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.746249 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:57 crc kubenswrapper[4959]: I1007 15:13:57.840508 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest"
Oct 07 15:13:58 crc kubenswrapper[4959]: I1007 15:13:58.297910 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"]
Oct 07 15:13:59 crc kubenswrapper[4959]: I1007 15:13:59.329520 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"f07fca81-ce0b-4795-94ce-f4430d953e7a","Type":"ContainerStarted","Data":"d9f0b3dd8b3126c725f1df66dccb17a77a816a4792cd98453e493bd5bec74aac"}
Oct 07 15:14:07 crc kubenswrapper[4959]: I1007 15:14:07.696425 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:14:07 crc kubenswrapper[4959]: I1007 15:14:07.697198 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:14:13 crc kubenswrapper[4959]: E1007 15:14:13.776326 4959 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizontest:current-podified"
Oct 07 15:14:13 crc kubenswrapper[4959]: E1007 15:14:13.777154 4959 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizontest-tests-horizontest,Image:quay.io/podified-antelope-centos9/openstack-horizontest:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADMIN_PASSWORD,Value:12345678,ValueFrom:nil,},EnvVar{Name:ADMIN_USERNAME,Value:admin,ValueFrom:nil,},EnvVar{Name:AUTH_URL,Value:https://keystone-public-openstack.apps-crc.testing,ValueFrom:nil,},EnvVar{Name:DASHBOARD_URL,Value:https://horizon-openstack.apps-crc.testing/,ValueFrom:nil,},EnvVar{Name:EXTRA_FLAG,Value:not pagination and test_users.py,ValueFrom:nil,},EnvVar{Name:FLAVOR_NAME,Value:m1.tiny,ValueFrom:nil,},EnvVar{Name:HORIZONTEST_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:HORIZON_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:HORIZON_LOGS_DIR_NAME,Value:horizon,ValueFrom:nil,},EnvVar{Name:HORIZON_REPO_BRANCH,Value:master,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE,Value:/var/lib/horizontest/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE_NAME,Value:cirros-0.6.2-x86_64-disk,ValueFrom:nil,},EnvVar{Name:IMAGE_URL,Value:http://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:PASSWORD,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME_XPATH,Value://*[@class=\"context-project\"]//ancestor::ul,ValueFrom:nil,},EnvVar{Name:REPO_URL,Value:https://review.opendev.org/openstack/horizon,ValueFrom:nil,},EnvVar{Name:USER_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{1 0} {} 1 DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/horizontest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/horizontest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/horizontest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrvpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42455,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42455,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizontest-tests-horizontest_openstack(f07fca81-ce0b-4795-94ce-f4430d953e7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 15:14:13 crc kubenswrapper[4959]: E1007 15:14:13.778394 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizontest-tests-horizontest" podUID="f07fca81-ce0b-4795-94ce-f4430d953e7a"
Oct 07 15:14:14 crc kubenswrapper[4959]: E1007 15:14:14.504730 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizontest:current-podified\\\"\"" pod="openstack/horizontest-tests-horizontest" podUID="f07fca81-ce0b-4795-94ce-f4430d953e7a"
Oct 07 15:14:30 crc kubenswrapper[4959]: I1007 15:14:30.663989 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"f07fca81-ce0b-4795-94ce-f4430d953e7a","Type":"ContainerStarted","Data":"db06ee09c36a29319aef969c12350d7770fd0e93351948ea2cb5efb9deb5c16a"}
Oct 07 15:14:30 crc kubenswrapper[4959]: I1007 15:14:30.692052 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizontest-tests-horizontest" podStartSLOduration=3.419339789 podStartE2EDuration="34.692027265s" podCreationTimestamp="2025-10-07 15:13:56 +0000 UTC" firstStartedPulling="2025-10-07 15:13:58.29937669 +0000 UTC m=+7990.460099367" lastFinishedPulling="2025-10-07 15:14:29.572064166 +0000 UTC m=+8021.732786843" observedRunningTime="2025-10-07 15:14:30.68522011 +0000 UTC m=+8022.845942787" watchObservedRunningTime="2025-10-07 15:14:30.692027265 +0000 UTC m=+8022.852749942"
Oct 07 15:14:37 crc kubenswrapper[4959]: I1007 15:14:37.695349 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:14:37 crc kubenswrapper[4959]: I1007 15:14:37.695897 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.154814 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"]
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.157723 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.160763 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.168447 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.176831 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"]
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.258821 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5afc5410-30e1-4403-a5cd-854e8e77952e-config-volume\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.258946 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5afc5410-30e1-4403-a5cd-854e8e77952e-secret-volume\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.259317 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh86f\" (UniqueName: \"kubernetes.io/projected/5afc5410-30e1-4403-a5cd-854e8e77952e-kube-api-access-rh86f\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.362228 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh86f\" (UniqueName: \"kubernetes.io/projected/5afc5410-30e1-4403-a5cd-854e8e77952e-kube-api-access-rh86f\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.362329 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5afc5410-30e1-4403-a5cd-854e8e77952e-config-volume\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.362373 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5afc5410-30e1-4403-a5cd-854e8e77952e-secret-volume\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.363591 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5afc5410-30e1-4403-a5cd-854e8e77952e-config-volume\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.376542 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5afc5410-30e1-4403-a5cd-854e8e77952e-secret-volume\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.383101 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh86f\" (UniqueName: \"kubernetes.io/projected/5afc5410-30e1-4403-a5cd-854e8e77952e-kube-api-access-rh86f\") pod \"collect-profiles-29330835-rrnj4\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:00 crc kubenswrapper[4959]: I1007 15:15:00.528260 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:01 crc kubenswrapper[4959]: I1007 15:15:01.005401 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"]
Oct 07 15:15:01 crc kubenswrapper[4959]: I1007 15:15:01.975484 4959 generic.go:334] "Generic (PLEG): container finished" podID="5afc5410-30e1-4403-a5cd-854e8e77952e" containerID="bdd3ad94f9403f2fcac90465d4fd71f3dd23a67066dbd8ce2700082ae31c3fac" exitCode=0
Oct 07 15:15:01 crc kubenswrapper[4959]: I1007 15:15:01.975555 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4" event={"ID":"5afc5410-30e1-4403-a5cd-854e8e77952e","Type":"ContainerDied","Data":"bdd3ad94f9403f2fcac90465d4fd71f3dd23a67066dbd8ce2700082ae31c3fac"}
Oct 07 15:15:01 crc kubenswrapper[4959]: I1007 15:15:01.975919 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4" event={"ID":"5afc5410-30e1-4403-a5cd-854e8e77952e","Type":"ContainerStarted","Data":"4dc1b97ab5e8fa05597536a22f8170854c50aad2c95adf40c25e759d8e9cfb8b"}
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.420798 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.547891 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5afc5410-30e1-4403-a5cd-854e8e77952e-config-volume\") pod \"5afc5410-30e1-4403-a5cd-854e8e77952e\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") "
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.547960 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5afc5410-30e1-4403-a5cd-854e8e77952e-secret-volume\") pod \"5afc5410-30e1-4403-a5cd-854e8e77952e\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") "
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.548141 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh86f\" (UniqueName: \"kubernetes.io/projected/5afc5410-30e1-4403-a5cd-854e8e77952e-kube-api-access-rh86f\") pod \"5afc5410-30e1-4403-a5cd-854e8e77952e\" (UID: \"5afc5410-30e1-4403-a5cd-854e8e77952e\") "
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.548886 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5afc5410-30e1-4403-a5cd-854e8e77952e-config-volume" (OuterVolumeSpecName: "config-volume") pod "5afc5410-30e1-4403-a5cd-854e8e77952e" (UID: "5afc5410-30e1-4403-a5cd-854e8e77952e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.554353 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5afc5410-30e1-4403-a5cd-854e8e77952e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5afc5410-30e1-4403-a5cd-854e8e77952e" (UID: "5afc5410-30e1-4403-a5cd-854e8e77952e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.554839 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5afc5410-30e1-4403-a5cd-854e8e77952e-kube-api-access-rh86f" (OuterVolumeSpecName: "kube-api-access-rh86f") pod "5afc5410-30e1-4403-a5cd-854e8e77952e" (UID: "5afc5410-30e1-4403-a5cd-854e8e77952e"). InnerVolumeSpecName "kube-api-access-rh86f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.650520 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh86f\" (UniqueName: \"kubernetes.io/projected/5afc5410-30e1-4403-a5cd-854e8e77952e-kube-api-access-rh86f\") on node \"crc\" DevicePath \"\""
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.650562 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5afc5410-30e1-4403-a5cd-854e8e77952e-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.650583 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5afc5410-30e1-4403-a5cd-854e8e77952e-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.996033 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4" event={"ID":"5afc5410-30e1-4403-a5cd-854e8e77952e","Type":"ContainerDied","Data":"4dc1b97ab5e8fa05597536a22f8170854c50aad2c95adf40c25e759d8e9cfb8b"}
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.996355 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dc1b97ab5e8fa05597536a22f8170854c50aad2c95adf40c25e759d8e9cfb8b"
Oct 07 15:15:03 crc kubenswrapper[4959]: I1007 15:15:03.996090 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-rrnj4"
Oct 07 15:15:04 crc kubenswrapper[4959]: I1007 15:15:04.524889 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"]
Oct 07 15:15:04 crc kubenswrapper[4959]: I1007 15:15:04.540765 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-kb2lk"]
Oct 07 15:15:04 crc kubenswrapper[4959]: I1007 15:15:04.827534 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e8a0c0-5963-4abf-bc69-a015bf7e0064" path="/var/lib/kubelet/pods/59e8a0c0-5963-4abf-bc69-a015bf7e0064/volumes"
Oct 07 15:15:07 crc kubenswrapper[4959]: I1007 15:15:07.696246 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:15:07 crc kubenswrapper[4959]: I1007 15:15:07.697206 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:15:07 crc kubenswrapper[4959]: I1007 15:15:07.697280 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 15:15:07 crc kubenswrapper[4959]: I1007 15:15:07.698468 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 15:15:07 crc kubenswrapper[4959]: I1007 15:15:07.698538 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" gracePeriod=600
Oct 07 15:15:07 crc kubenswrapper[4959]: E1007 15:15:07.822999 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:15:08 crc kubenswrapper[4959]: I1007 15:15:08.040295 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" exitCode=0
Oct 07 15:15:08 crc kubenswrapper[4959]: I1007 15:15:08.040845 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"}
Oct 07 15:15:08 crc kubenswrapper[4959]: I1007 15:15:08.040982 4959 scope.go:117] "RemoveContainer" containerID="a3ed63282df901adf55f77883df90781c0e6935c29dea4d25bec61214523c3c6"
Oct 07 15:15:08 crc kubenswrapper[4959]: I1007 15:15:08.047897 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:15:08 crc kubenswrapper[4959]: E1007 15:15:08.049564 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:15:18 crc kubenswrapper[4959]: I1007 15:15:18.817953 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:15:18 crc kubenswrapper[4959]: E1007 15:15:18.818891 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:15:31 crc kubenswrapper[4959]: I1007 15:15:31.809789 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:15:31 crc kubenswrapper[4959]: E1007 15:15:31.811037 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:15:42 crc kubenswrapper[4959]: I1007 15:15:42.808913 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:15:42 crc kubenswrapper[4959]: E1007 15:15:42.809931 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:15:54 crc kubenswrapper[4959]: I1007 15:15:54.809466 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:15:54 crc kubenswrapper[4959]: E1007 15:15:54.810512 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:15:57 crc kubenswrapper[4959]: I1007 15:15:57.217052 4959 scope.go:117] "RemoveContainer" containerID="f42a54ed3c6e9321bb4da3c3d693b4b8a546006bca1900b6ee72fc1e0482cc40"
Oct 07 15:16:06 crc kubenswrapper[4959]: I1007 15:16:06.809662 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:16:06 crc kubenswrapper[4959]: E1007 15:16:06.810907 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:16:19 crc kubenswrapper[4959]: I1007 15:16:19.809112 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6"
Oct 07 15:16:19 crc kubenswrapper[4959]: E1007 15:16:19.810555 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:16:25 crc kubenswrapper[4959]: I1007 15:16:25.780080 4959 generic.go:334] "Generic (PLEG): container finished" podID="f07fca81-ce0b-4795-94ce-f4430d953e7a" containerID="db06ee09c36a29319aef969c12350d7770fd0e93351948ea2cb5efb9deb5c16a" exitCode=0
Oct 07 15:16:25 crc kubenswrapper[4959]: I1007 15:16:25.780691 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"f07fca81-ce0b-4795-94ce-f4430d953e7a","Type":"ContainerDied","Data":"db06ee09c36a29319aef969c12350d7770fd0e93351948ea2cb5efb9deb5c16a"}
Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.149339 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279218 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-openstack-config-secret\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279363 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-clouds-config\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279389 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ceph\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279423 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279498 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-temporary\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279541 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ca-certs\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279588 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-workdir\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.279620 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrvpl\" (UniqueName: \"kubernetes.io/projected/f07fca81-ce0b-4795-94ce-f4430d953e7a-kube-api-access-rrvpl\") pod \"f07fca81-ce0b-4795-94ce-f4430d953e7a\" (UID: \"f07fca81-ce0b-4795-94ce-f4430d953e7a\") " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.280739 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.286970 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ceph" (OuterVolumeSpecName: "ceph") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.287142 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.287583 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07fca81-ce0b-4795-94ce-f4430d953e7a-kube-api-access-rrvpl" (OuterVolumeSpecName: "kube-api-access-rrvpl") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "kube-api-access-rrvpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.311532 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.333932 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "test-operator-clouds-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.336967 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.381867 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.382235 4959 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.382246 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrvpl\" (UniqueName: \"kubernetes.io/projected/f07fca81-ce0b-4795-94ce-f4430d953e7a-kube-api-access-rrvpl\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.382260 4959 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.382269 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.382277 4959 reconciler_common.go:293] "Volume 
detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07fca81-ce0b-4795-94ce-f4430d953e7a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.382310 4959 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.418276 4959 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.485115 4959 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.528948 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "f07fca81-ce0b-4795-94ce-f4430d953e7a" (UID: "f07fca81-ce0b-4795-94ce-f4430d953e7a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.587457 4959 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f07fca81-ce0b-4795-94ce-f4430d953e7a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.811481 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"f07fca81-ce0b-4795-94ce-f4430d953e7a","Type":"ContainerDied","Data":"d9f0b3dd8b3126c725f1df66dccb17a77a816a4792cd98453e493bd5bec74aac"} Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.811528 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9f0b3dd8b3126c725f1df66dccb17a77a816a4792cd98453e493bd5bec74aac" Oct 07 15:16:27 crc kubenswrapper[4959]: I1007 15:16:27.811598 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Oct 07 15:16:30 crc kubenswrapper[4959]: I1007 15:16:30.809887 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:16:30 crc kubenswrapper[4959]: E1007 15:16:30.810962 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.444186 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Oct 07 15:16:32 crc kubenswrapper[4959]: E1007 15:16:32.444997 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07fca81-ce0b-4795-94ce-f4430d953e7a" containerName="horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.445011 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07fca81-ce0b-4795-94ce-f4430d953e7a" containerName="horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: E1007 15:16:32.445058 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afc5410-30e1-4403-a5cd-854e8e77952e" containerName="collect-profiles" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.445065 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afc5410-30e1-4403-a5cd-854e8e77952e" containerName="collect-profiles" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.445284 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07fca81-ce0b-4795-94ce-f4430d953e7a" containerName="horizontest-tests-horizontest" Oct 07 15:16:32 
crc kubenswrapper[4959]: I1007 15:16:32.445303 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5afc5410-30e1-4403-a5cd-854e8e77952e" containerName="collect-profiles" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.446355 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.455802 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.611130 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.611297 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfplf\" (UniqueName: \"kubernetes.io/projected/8ddae6ed-5ca7-45f7-bf73-afea2af7d7de-kube-api-access-tfplf\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.713691 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 
07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.713942 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfplf\" (UniqueName: \"kubernetes.io/projected/8ddae6ed-5ca7-45f7-bf73-afea2af7d7de-kube-api-access-tfplf\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.714489 4959 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.740089 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfplf\" (UniqueName: \"kubernetes.io/projected/8ddae6ed-5ca7-45f7-bf73-afea2af7d7de-kube-api-access-tfplf\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.741192 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: I1007 15:16:32.770027 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 07 15:16:32 crc kubenswrapper[4959]: E1007 15:16:32.770146 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:16:33 crc kubenswrapper[4959]: I1007 15:16:33.206940 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Oct 07 15:16:33 crc kubenswrapper[4959]: E1007 15:16:33.210639 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:16:33 crc kubenswrapper[4959]: I1007 15:16:33.863213 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de","Type":"ContainerStarted","Data":"dfcf65c1d5c787e942e17df036e53d6fea10d86519e62e197b39983084f0fc82"} Oct 07 15:16:34 crc kubenswrapper[4959]: E1007 15:16:34.472872 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:16:34 crc kubenswrapper[4959]: I1007 15:16:34.886356 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"8ddae6ed-5ca7-45f7-bf73-afea2af7d7de","Type":"ContainerStarted","Data":"2862fd93f473b9b43ed7769b674dff2fcfbf51f57c038aa5416c02e7d693929c"} Oct 07 15:16:34 crc kubenswrapper[4959]: E1007 
15:16:34.887035 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:16:34 crc kubenswrapper[4959]: I1007 15:16:34.902080 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" podStartSLOduration=1.641209366 podStartE2EDuration="2.902056127s" podCreationTimestamp="2025-10-07 15:16:32 +0000 UTC" firstStartedPulling="2025-10-07 15:16:33.211832465 +0000 UTC m=+8145.372555142" lastFinishedPulling="2025-10-07 15:16:34.472679226 +0000 UTC m=+8146.633401903" observedRunningTime="2025-10-07 15:16:34.898580607 +0000 UTC m=+8147.059303284" watchObservedRunningTime="2025-10-07 15:16:34.902056127 +0000 UTC m=+8147.062778804" Oct 07 15:16:35 crc kubenswrapper[4959]: E1007 15:16:35.894803 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:16:42 crc kubenswrapper[4959]: I1007 15:16:42.809541 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:16:42 crc kubenswrapper[4959]: E1007 15:16:42.811564 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:16:53 crc kubenswrapper[4959]: I1007 15:16:53.809773 4959 scope.go:117] 
"RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:16:53 crc kubenswrapper[4959]: E1007 15:16:53.810673 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.238527 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k527r/must-gather-kkl29"] Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.241133 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.248821 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k527r/must-gather-kkl29"] Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.257154 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k527r"/"openshift-service-ca.crt" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.257422 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k527r"/"kube-root-ca.crt" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.263995 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k527r"/"default-dockercfg-wf6sh" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.340718 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48cpg\" (UniqueName: \"kubernetes.io/projected/8b212cb1-0340-4e4b-8582-7b9cd9429869-kube-api-access-48cpg\") pod \"must-gather-kkl29\" (UID: 
\"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.340781 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b212cb1-0340-4e4b-8582-7b9cd9429869-must-gather-output\") pod \"must-gather-kkl29\" (UID: \"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.443016 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48cpg\" (UniqueName: \"kubernetes.io/projected/8b212cb1-0340-4e4b-8582-7b9cd9429869-kube-api-access-48cpg\") pod \"must-gather-kkl29\" (UID: \"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.443132 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b212cb1-0340-4e4b-8582-7b9cd9429869-must-gather-output\") pod \"must-gather-kkl29\" (UID: \"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.443690 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b212cb1-0340-4e4b-8582-7b9cd9429869-must-gather-output\") pod \"must-gather-kkl29\" (UID: \"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.461299 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48cpg\" (UniqueName: \"kubernetes.io/projected/8b212cb1-0340-4e4b-8582-7b9cd9429869-kube-api-access-48cpg\") pod \"must-gather-kkl29\" (UID: 
\"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:57 crc kubenswrapper[4959]: I1007 15:16:57.563695 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:16:58 crc kubenswrapper[4959]: I1007 15:16:58.011660 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k527r/must-gather-kkl29"] Oct 07 15:16:58 crc kubenswrapper[4959]: I1007 15:16:58.103526 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/must-gather-kkl29" event={"ID":"8b212cb1-0340-4e4b-8582-7b9cd9429869","Type":"ContainerStarted","Data":"2092eb7cff70febaa0edacbedde919912e79f755871d5cb657c395bb6e08e5d3"} Oct 07 15:17:04 crc kubenswrapper[4959]: I1007 15:17:04.169032 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/must-gather-kkl29" event={"ID":"8b212cb1-0340-4e4b-8582-7b9cd9429869","Type":"ContainerStarted","Data":"612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc"} Oct 07 15:17:04 crc kubenswrapper[4959]: I1007 15:17:04.170528 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/must-gather-kkl29" event={"ID":"8b212cb1-0340-4e4b-8582-7b9cd9429869","Type":"ContainerStarted","Data":"1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94"} Oct 07 15:17:04 crc kubenswrapper[4959]: I1007 15:17:04.189640 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k527r/must-gather-kkl29" podStartSLOduration=1.5892779240000001 podStartE2EDuration="7.18960637s" podCreationTimestamp="2025-10-07 15:16:57 +0000 UTC" firstStartedPulling="2025-10-07 15:16:58.028049138 +0000 UTC m=+8170.188771815" lastFinishedPulling="2025-10-07 15:17:03.628377584 +0000 UTC m=+8175.789100261" observedRunningTime="2025-10-07 15:17:04.181327052 +0000 UTC m=+8176.342049729" 
watchObservedRunningTime="2025-10-07 15:17:04.18960637 +0000 UTC m=+8176.350329047" Oct 07 15:17:04 crc kubenswrapper[4959]: I1007 15:17:04.809641 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:17:04 crc kubenswrapper[4959]: E1007 15:17:04.810151 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:17:10 crc kubenswrapper[4959]: I1007 15:17:10.921954 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nhxdr"] Oct 07 15:17:10 crc kubenswrapper[4959]: I1007 15:17:10.926510 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:10 crc kubenswrapper[4959]: I1007 15:17:10.930168 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhxdr"] Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.073226 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-utilities\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.073322 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dvb\" (UniqueName: \"kubernetes.io/projected/34dea484-d6ed-4d86-aab0-e3f5984886ee-kube-api-access-s5dvb\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.073467 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-catalog-content\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.175559 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dvb\" (UniqueName: \"kubernetes.io/projected/34dea484-d6ed-4d86-aab0-e3f5984886ee-kube-api-access-s5dvb\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.175920 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-catalog-content\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.176133 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-utilities\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.176496 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-catalog-content\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.176614 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-utilities\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.196559 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dvb\" (UniqueName: \"kubernetes.io/projected/34dea484-d6ed-4d86-aab0-e3f5984886ee-kube-api-access-s5dvb\") pod \"community-operators-nhxdr\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.249312 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:11 crc kubenswrapper[4959]: I1007 15:17:11.779557 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhxdr"] Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.238078 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k527r/crc-debug-k2jv6"] Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.239691 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.244531 4959 generic.go:334] "Generic (PLEG): container finished" podID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerID="8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6" exitCode=0 Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.244591 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerDied","Data":"8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6"} Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.245122 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerStarted","Data":"d4fb2973886c397347f5f941979a4f05f0e6f014fbf32735678f122b30df5d0b"} Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.333141 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-host\") pod \"crc-debug-k2jv6\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.333655 4959 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbpzw\" (UniqueName: \"kubernetes.io/projected/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-kube-api-access-gbpzw\") pod \"crc-debug-k2jv6\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.435454 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-host\") pod \"crc-debug-k2jv6\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.435548 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-host\") pod \"crc-debug-k2jv6\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.435577 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbpzw\" (UniqueName: \"kubernetes.io/projected/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-kube-api-access-gbpzw\") pod \"crc-debug-k2jv6\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.454996 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbpzw\" (UniqueName: \"kubernetes.io/projected/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-kube-api-access-gbpzw\") pod \"crc-debug-k2jv6\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: I1007 15:17:12.555333 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:17:12 crc kubenswrapper[4959]: W1007 15:17:12.603485 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18291fa4_7f6b_4c2a_9bb6_3cdfc27f3063.slice/crio-3062d5ba5587baba2b2f734e916c34ce3c474dd87c3617f87b9316c71bb6a2d9 WatchSource:0}: Error finding container 3062d5ba5587baba2b2f734e916c34ce3c474dd87c3617f87b9316c71bb6a2d9: Status 404 returned error can't find the container with id 3062d5ba5587baba2b2f734e916c34ce3c474dd87c3617f87b9316c71bb6a2d9 Oct 07 15:17:13 crc kubenswrapper[4959]: I1007 15:17:13.263245 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerStarted","Data":"07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21"} Oct 07 15:17:13 crc kubenswrapper[4959]: I1007 15:17:13.267402 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-k2jv6" event={"ID":"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063","Type":"ContainerStarted","Data":"3062d5ba5587baba2b2f734e916c34ce3c474dd87c3617f87b9316c71bb6a2d9"} Oct 07 15:17:14 crc kubenswrapper[4959]: E1007 15:17:14.127307 4959 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.47:35976->38.102.83.47:37911: write tcp 38.102.83.47:35976->38.102.83.47:37911: write: broken pipe Oct 07 15:17:14 crc kubenswrapper[4959]: I1007 15:17:14.280953 4959 generic.go:334] "Generic (PLEG): container finished" podID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerID="07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21" exitCode=0 Oct 07 15:17:14 crc kubenswrapper[4959]: I1007 15:17:14.280999 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" 
event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerDied","Data":"07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21"} Oct 07 15:17:15 crc kubenswrapper[4959]: I1007 15:17:15.301908 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerStarted","Data":"871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab"} Oct 07 15:17:15 crc kubenswrapper[4959]: I1007 15:17:15.326909 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nhxdr" podStartSLOduration=2.756465184 podStartE2EDuration="5.326890185s" podCreationTimestamp="2025-10-07 15:17:10 +0000 UTC" firstStartedPulling="2025-10-07 15:17:12.247683594 +0000 UTC m=+8184.408406271" lastFinishedPulling="2025-10-07 15:17:14.818108595 +0000 UTC m=+8186.978831272" observedRunningTime="2025-10-07 15:17:15.321887561 +0000 UTC m=+8187.482610238" watchObservedRunningTime="2025-10-07 15:17:15.326890185 +0000 UTC m=+8187.487612862" Oct 07 15:17:15 crc kubenswrapper[4959]: I1007 15:17:15.809071 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:17:15 crc kubenswrapper[4959]: E1007 15:17:15.810153 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:17:21 crc kubenswrapper[4959]: I1007 15:17:21.250454 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:21 crc 
kubenswrapper[4959]: I1007 15:17:21.251026 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:21 crc kubenswrapper[4959]: I1007 15:17:21.311296 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:21 crc kubenswrapper[4959]: I1007 15:17:21.418321 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:21 crc kubenswrapper[4959]: I1007 15:17:21.548200 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhxdr"] Oct 07 15:17:23 crc kubenswrapper[4959]: I1007 15:17:23.382576 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nhxdr" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="registry-server" containerID="cri-o://871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab" gracePeriod=2 Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.321383 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.409130 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-k2jv6" event={"ID":"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063","Type":"ContainerStarted","Data":"786e6c17e12ff45d3b00e6729032f6752683731786e8bd0d3bc20326fc055440"} Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.419965 4959 generic.go:334] "Generic (PLEG): container finished" podID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerID="871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab" exitCode=0 Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.420386 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerDied","Data":"871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab"} Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.420428 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhxdr" event={"ID":"34dea484-d6ed-4d86-aab0-e3f5984886ee","Type":"ContainerDied","Data":"d4fb2973886c397347f5f941979a4f05f0e6f014fbf32735678f122b30df5d0b"} Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.420452 4959 scope.go:117] "RemoveContainer" containerID="871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.420738 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhxdr" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.437316 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-utilities\") pod \"34dea484-d6ed-4d86-aab0-e3f5984886ee\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.437605 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-catalog-content\") pod \"34dea484-d6ed-4d86-aab0-e3f5984886ee\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.437683 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5dvb\" (UniqueName: \"kubernetes.io/projected/34dea484-d6ed-4d86-aab0-e3f5984886ee-kube-api-access-s5dvb\") pod \"34dea484-d6ed-4d86-aab0-e3f5984886ee\" (UID: \"34dea484-d6ed-4d86-aab0-e3f5984886ee\") " Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.438959 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-utilities" (OuterVolumeSpecName: "utilities") pod "34dea484-d6ed-4d86-aab0-e3f5984886ee" (UID: "34dea484-d6ed-4d86-aab0-e3f5984886ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.439582 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.456025 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dea484-d6ed-4d86-aab0-e3f5984886ee-kube-api-access-s5dvb" (OuterVolumeSpecName: "kube-api-access-s5dvb") pod "34dea484-d6ed-4d86-aab0-e3f5984886ee" (UID: "34dea484-d6ed-4d86-aab0-e3f5984886ee"). InnerVolumeSpecName "kube-api-access-s5dvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.456094 4959 scope.go:117] "RemoveContainer" containerID="07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.489859 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34dea484-d6ed-4d86-aab0-e3f5984886ee" (UID: "34dea484-d6ed-4d86-aab0-e3f5984886ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.509352 4959 scope.go:117] "RemoveContainer" containerID="8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.540471 4959 scope.go:117] "RemoveContainer" containerID="871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab" Oct 07 15:17:25 crc kubenswrapper[4959]: E1007 15:17:25.541046 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab\": container with ID starting with 871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab not found: ID does not exist" containerID="871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.541098 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab"} err="failed to get container status \"871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab\": rpc error: code = NotFound desc = could not find container \"871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab\": container with ID starting with 871c05cb26c1cd4553915e31ccbc7ceb19ca8e7efef3ede87351556cbc87a8ab not found: ID does not exist" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.541128 4959 scope.go:117] "RemoveContainer" containerID="07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.541166 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34dea484-d6ed-4d86-aab0-e3f5984886ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.541194 4959 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5dvb\" (UniqueName: \"kubernetes.io/projected/34dea484-d6ed-4d86-aab0-e3f5984886ee-kube-api-access-s5dvb\") on node \"crc\" DevicePath \"\"" Oct 07 15:17:25 crc kubenswrapper[4959]: E1007 15:17:25.541593 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21\": container with ID starting with 07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21 not found: ID does not exist" containerID="07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.541807 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21"} err="failed to get container status \"07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21\": rpc error: code = NotFound desc = could not find container \"07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21\": container with ID starting with 07d7f68625eab1ac84392425e214d58d9e9ee48643266662e57da6ee685f3f21 not found: ID does not exist" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.541846 4959 scope.go:117] "RemoveContainer" containerID="8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6" Oct 07 15:17:25 crc kubenswrapper[4959]: E1007 15:17:25.542270 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6\": container with ID starting with 8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6 not found: ID does not exist" containerID="8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.542302 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6"} err="failed to get container status \"8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6\": rpc error: code = NotFound desc = could not find container \"8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6\": container with ID starting with 8165f5f24646bb0f2b5cb21adeeff566b466bdcfd040eece378a591104bb0fe6 not found: ID does not exist" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.746205 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k527r/crc-debug-k2jv6" podStartSLOduration=1.342860202 podStartE2EDuration="13.746184497s" podCreationTimestamp="2025-10-07 15:17:12 +0000 UTC" firstStartedPulling="2025-10-07 15:17:12.609486047 +0000 UTC m=+8184.770208724" lastFinishedPulling="2025-10-07 15:17:25.012810342 +0000 UTC m=+8197.173533019" observedRunningTime="2025-10-07 15:17:25.444075097 +0000 UTC m=+8197.604797784" watchObservedRunningTime="2025-10-07 15:17:25.746184497 +0000 UTC m=+8197.906907174" Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.755953 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhxdr"] Oct 07 15:17:25 crc kubenswrapper[4959]: I1007 15:17:25.764101 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nhxdr"] Oct 07 15:17:26 crc kubenswrapper[4959]: I1007 15:17:26.824052 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" path="/var/lib/kubelet/pods/34dea484-d6ed-4d86-aab0-e3f5984886ee/volumes" Oct 07 15:17:29 crc kubenswrapper[4959]: I1007 15:17:29.809748 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:17:29 crc kubenswrapper[4959]: E1007 15:17:29.810683 4959 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:17:41 crc kubenswrapper[4959]: I1007 15:17:41.809527 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:17:41 crc kubenswrapper[4959]: E1007 15:17:41.810749 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:17:45 crc kubenswrapper[4959]: E1007 15:17:45.808838 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:17:52 crc kubenswrapper[4959]: I1007 15:17:52.810155 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:17:52 crc kubenswrapper[4959]: E1007 15:17:52.811299 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:17:57 crc kubenswrapper[4959]: I1007 15:17:57.362346 4959 scope.go:117] "RemoveContainer" containerID="8b48874f82a3101ab2f00562ab57f98fb9953361d8e7ad2a4e5881e1fffb7211" Oct 07 15:18:07 crc kubenswrapper[4959]: I1007 15:18:07.809985 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:18:07 crc kubenswrapper[4959]: E1007 15:18:07.810874 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:18:19 crc kubenswrapper[4959]: I1007 15:18:19.963368 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_9a227eb5-2c22-41c7-a0d8-a35d821c46e6/ansibletest-ansibletest/0.log" Oct 07 15:18:20 crc kubenswrapper[4959]: I1007 15:18:20.249650 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d8db6f568-8zwbx_80c6297a-2d51-4a7b-9da0-761f69d6f3b7/barbican-api/0.log" Oct 07 15:18:20 crc kubenswrapper[4959]: I1007 15:18:20.460119 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d8db6f568-8zwbx_80c6297a-2d51-4a7b-9da0-761f69d6f3b7/barbican-api-log/0.log" Oct 07 15:18:20 crc kubenswrapper[4959]: I1007 15:18:20.750698 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-57fd9f6674-4cfc2_97567312-2948-4f23-a1e5-da00d2689376/barbican-keystone-listener/0.log" Oct 07 15:18:21 crc kubenswrapper[4959]: I1007 15:18:21.337617 4959 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-keystone-listener-57fd9f6674-4cfc2_97567312-2948-4f23-a1e5-da00d2689376/barbican-keystone-listener-log/0.log" Oct 07 15:18:21 crc kubenswrapper[4959]: I1007 15:18:21.359783 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-878d55485-gnqkk_68235903-6ab3-44c7-90a1-c49f473e4568/barbican-worker/0.log" Oct 07 15:18:21 crc kubenswrapper[4959]: I1007 15:18:21.719446 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-878d55485-gnqkk_68235903-6ab3-44c7-90a1-c49f473e4568/barbican-worker-log/0.log" Oct 07 15:18:22 crc kubenswrapper[4959]: I1007 15:18:22.174937 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz_7be1a560-abc0-4b57-a960-85019afbe322/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:22 crc kubenswrapper[4959]: I1007 15:18:22.511339 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/ceilometer-central-agent/0.log" Oct 07 15:18:22 crc kubenswrapper[4959]: I1007 15:18:22.579269 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/ceilometer-notification-agent/0.log" Oct 07 15:18:22 crc kubenswrapper[4959]: I1007 15:18:22.810248 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:18:22 crc kubenswrapper[4959]: E1007 15:18:22.810967 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:18:22 crc kubenswrapper[4959]: I1007 15:18:22.844862 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/sg-core/0.log" Oct 07 15:18:22 crc kubenswrapper[4959]: I1007 15:18:22.848288 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/proxy-httpd/0.log" Oct 07 15:18:23 crc kubenswrapper[4959]: I1007 15:18:23.077293 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42_bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:23 crc kubenswrapper[4959]: I1007 15:18:23.316054 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b_090ad048-3bec-4657-b329-1fbdba663340/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:23 crc kubenswrapper[4959]: I1007 15:18:23.584453 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_54a9118f-48be-4663-ba53-6e107a5d09e8/cinder-api-log/0.log" Oct 07 15:18:23 crc kubenswrapper[4959]: I1007 15:18:23.625573 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_54a9118f-48be-4663-ba53-6e107a5d09e8/cinder-api/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.008320 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3dcdce3c-0b57-4c61-84d9-61c99ba03314/probe/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.045710 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3dcdce3c-0b57-4c61-84d9-61c99ba03314/cinder-backup/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.375893 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_96e0aa23-8c42-4616-af38-0eb612e5f181/probe/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.387361 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96e0aa23-8c42-4616-af38-0eb612e5f181/cinder-scheduler/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.632138 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_ef8431b3-9196-4986-aba7-43ffefa14817/cinder-volume/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.659425 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_ef8431b3-9196-4986-aba7-43ffefa14817/probe/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.862181 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wh847_2e1533f6-5266-414d-b116-f87c2acd344a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:24 crc kubenswrapper[4959]: I1007 15:18:24.949660 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vgg98_cffeb5da-ab9c-4c47-a6e2-2e647c4ac860/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:25 crc kubenswrapper[4959]: I1007 15:18:25.154046 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55d8975557-jwvh5_e8fbe198-197d-4725-acfc-c846f5b5c32a/init/0.log" Oct 07 15:18:25 crc kubenswrapper[4959]: I1007 15:18:25.425463 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55d8975557-jwvh5_e8fbe198-197d-4725-acfc-c846f5b5c32a/init/0.log" Oct 07 15:18:25 crc kubenswrapper[4959]: I1007 15:18:25.689181 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55d8975557-jwvh5_e8fbe198-197d-4725-acfc-c846f5b5c32a/dnsmasq-dns/0.log" Oct 07 15:18:25 crc 
kubenswrapper[4959]: I1007 15:18:25.717561 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77ff234e-dd31-4847-8517-4befe98845f7/glance-httpd/0.log" Oct 07 15:18:25 crc kubenswrapper[4959]: I1007 15:18:25.765062 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77ff234e-dd31-4847-8517-4befe98845f7/glance-log/0.log" Oct 07 15:18:26 crc kubenswrapper[4959]: I1007 15:18:26.209999 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cc40f402-6581-45a7-945f-a64d217724ab/glance-httpd/0.log" Oct 07 15:18:26 crc kubenswrapper[4959]: I1007 15:18:26.229610 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cc40f402-6581-45a7-945f-a64d217724ab/glance-log/0.log" Oct 07 15:18:26 crc kubenswrapper[4959]: I1007 15:18:26.542249 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc68dfcf6-xkrw7_41b4db91-ead3-4028-b30c-e3e726ae6f1e/horizon/0.log" Oct 07 15:18:26 crc kubenswrapper[4959]: I1007 15:18:26.870873 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_f07fca81-ce0b-4795-94ce-f4430d953e7a/horizontest-tests-horizontest/0.log" Oct 07 15:18:27 crc kubenswrapper[4959]: I1007 15:18:27.150134 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg_1c5e92bc-6eae-4ed1-81e8-400019fc8a13/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:27 crc kubenswrapper[4959]: I1007 15:18:27.448145 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7b997_5e262fa9-5abf-4283-99ed-ead5affb1282/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:27 crc kubenswrapper[4959]: I1007 15:18:27.772371 4959 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_horizon-dc68dfcf6-xkrw7_41b4db91-ead3-4028-b30c-e3e726ae6f1e/horizon-log/0.log" Oct 07 15:18:28 crc kubenswrapper[4959]: I1007 15:18:28.152189 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330761-j7cjf_0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40/keystone-cron/0.log" Oct 07 15:18:28 crc kubenswrapper[4959]: I1007 15:18:28.486447 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330821-7g47d_95a09836-b1d0-4b20-8b66-13cadce981d6/keystone-cron/0.log" Oct 07 15:18:28 crc kubenswrapper[4959]: I1007 15:18:28.742033 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_44961788-4f6e-4912-a20e-4648a7760dce/kube-state-metrics/0.log" Oct 07 15:18:29 crc kubenswrapper[4959]: I1007 15:18:29.081722 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb_dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:29 crc kubenswrapper[4959]: I1007 15:18:29.602953 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_640ef79d-5203-4f5e-8119-2f1eecb02bf1/manila-api-log/0.log" Oct 07 15:18:29 crc kubenswrapper[4959]: I1007 15:18:29.635917 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_640ef79d-5203-4f5e-8119-2f1eecb02bf1/manila-api/0.log" Oct 07 15:18:30 crc kubenswrapper[4959]: I1007 15:18:30.028451 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-54f9969c74-l8zmx_969d49d0-51dc-47c4-a4fb-aba1b09f4a6a/keystone-api/0.log" Oct 07 15:18:30 crc kubenswrapper[4959]: I1007 15:18:30.066024 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_45ac094a-4d13-4664-94e2-149bdb7b4548/manila-scheduler/0.log" Oct 07 15:18:30 crc kubenswrapper[4959]: I1007 15:18:30.196117 4959 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_45ac094a-4d13-4664-94e2-149bdb7b4548/probe/0.log" Oct 07 15:18:30 crc kubenswrapper[4959]: I1007 15:18:30.669315 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8e08ebef-1b6b-4040-8b0f-7c841e191363/manila-share/0.log" Oct 07 15:18:30 crc kubenswrapper[4959]: I1007 15:18:30.774476 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8e08ebef-1b6b-4040-8b0f-7c841e191363/probe/0.log" Oct 07 15:18:32 crc kubenswrapper[4959]: I1007 15:18:32.591490 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-86cfbf9b4f-pxglw_46472ab2-866f-4b3c-b030-7b05d02f9176/neutron-httpd/0.log" Oct 07 15:18:33 crc kubenswrapper[4959]: I1007 15:18:33.145457 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-86cfbf9b4f-pxglw_46472ab2-866f-4b3c-b030-7b05d02f9176/neutron-api/0.log" Oct 07 15:18:33 crc kubenswrapper[4959]: I1007 15:18:33.257113 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx_fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:37 crc kubenswrapper[4959]: I1007 15:18:37.809215 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:18:37 crc kubenswrapper[4959]: E1007 15:18:37.810304 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:18:38 crc kubenswrapper[4959]: I1007 
15:18:38.276150 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e/nova-api-log/0.log" Oct 07 15:18:39 crc kubenswrapper[4959]: I1007 15:18:39.298480 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e/nova-api-api/0.log" Oct 07 15:18:39 crc kubenswrapper[4959]: I1007 15:18:39.374047 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bcabf204-0890-4bfc-9a94-b921b3011603/nova-cell0-conductor-conductor/0.log" Oct 07 15:18:39 crc kubenswrapper[4959]: I1007 15:18:39.735872 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8446fa80-aebe-45ba-a6a7-4f51402f3d38/nova-cell1-conductor-conductor/0.log" Oct 07 15:18:40 crc kubenswrapper[4959]: I1007 15:18:40.446787 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9addbd40-1800-4967-bb06-7a90697034dd/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 15:18:40 crc kubenswrapper[4959]: I1007 15:18:40.826461 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw_0ebc66fe-ebad-47d5-93df-fbff665959d9/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:41 crc kubenswrapper[4959]: I1007 15:18:41.092547 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_627022b9-2219-4bcd-a001-53bf9e863c14/nova-metadata-log/0.log" Oct 07 15:18:42 crc kubenswrapper[4959]: I1007 15:18:42.295230 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a46a9782-e96e-432c-b2e8-c7863291485e/nova-scheduler-scheduler/0.log" Oct 07 15:18:42 crc kubenswrapper[4959]: I1007 15:18:42.805780 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_fed91ea6-e906-47c4-84e0-123c01a9780d/mysql-bootstrap/0.log" Oct 07 15:18:43 crc kubenswrapper[4959]: I1007 15:18:43.036152 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fed91ea6-e906-47c4-84e0-123c01a9780d/mysql-bootstrap/0.log" Oct 07 15:18:43 crc kubenswrapper[4959]: I1007 15:18:43.323286 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fed91ea6-e906-47c4-84e0-123c01a9780d/galera/0.log" Oct 07 15:18:43 crc kubenswrapper[4959]: I1007 15:18:43.845564 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e980567-4b6d-474f-ae89-3dc436ebf1a5/mysql-bootstrap/0.log" Oct 07 15:18:44 crc kubenswrapper[4959]: I1007 15:18:44.340991 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e980567-4b6d-474f-ae89-3dc436ebf1a5/mysql-bootstrap/0.log" Oct 07 15:18:44 crc kubenswrapper[4959]: I1007 15:18:44.582499 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e980567-4b6d-474f-ae89-3dc436ebf1a5/galera/0.log" Oct 07 15:18:45 crc kubenswrapper[4959]: I1007 15:18:45.061050 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_eab0abf5-c944-4a5c-9259-6dc0ea2b115f/openstackclient/0.log" Oct 07 15:18:45 crc kubenswrapper[4959]: I1007 15:18:45.519607 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lqwt9_dde9002d-236f-4dc3-947e-98e1e4e535c1/openstack-network-exporter/0.log" Oct 07 15:18:45 crc kubenswrapper[4959]: I1007 15:18:45.577441 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_627022b9-2219-4bcd-a001-53bf9e863c14/nova-metadata-metadata/0.log" Oct 07 15:18:45 crc kubenswrapper[4959]: I1007 15:18:45.804224 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovsdb-server-init/0.log" Oct 07 15:18:45 crc kubenswrapper[4959]: I1007 15:18:45.991325 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovsdb-server-init/0.log" Oct 07 15:18:46 crc kubenswrapper[4959]: I1007 15:18:46.163875 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovs-vswitchd/0.log" Oct 07 15:18:46 crc kubenswrapper[4959]: I1007 15:18:46.207179 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovsdb-server/0.log" Oct 07 15:18:46 crc kubenswrapper[4959]: I1007 15:18:46.403180 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-z8f9v_907772e5-2f0c-4478-9d3b-8f82eec8f258/ovn-controller/0.log" Oct 07 15:18:46 crc kubenswrapper[4959]: I1007 15:18:46.727587 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-w2tmp_153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:46 crc kubenswrapper[4959]: I1007 15:18:46.928218 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b18eda78-12ab-4cb2-ac1c-56907a2b4667/openstack-network-exporter/0.log" Oct 07 15:18:47 crc kubenswrapper[4959]: I1007 15:18:47.190412 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b18eda78-12ab-4cb2-ac1c-56907a2b4667/ovn-northd/0.log" Oct 07 15:18:47 crc kubenswrapper[4959]: I1007 15:18:47.426350 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_26d915bf-8d27-4349-9a3b-f13f13809cf5/ovsdbserver-nb/0.log" Oct 07 15:18:47 crc kubenswrapper[4959]: I1007 15:18:47.448994 4959 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_26d915bf-8d27-4349-9a3b-f13f13809cf5/openstack-network-exporter/0.log" Oct 07 15:18:47 crc kubenswrapper[4959]: I1007 15:18:47.730946 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_151d32f4-496d-43a0-aeb7-ee999d5faeef/openstack-network-exporter/0.log" Oct 07 15:18:47 crc kubenswrapper[4959]: E1007 15:18:47.809439 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:18:47 crc kubenswrapper[4959]: I1007 15:18:47.959269 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_151d32f4-496d-43a0-aeb7-ee999d5faeef/ovsdbserver-sb/0.log" Oct 07 15:18:48 crc kubenswrapper[4959]: I1007 15:18:48.896734 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58674f758b-wncml_ef3ca2a1-1eed-47fc-8454-47decce134d5/placement-api/0.log" Oct 07 15:18:49 crc kubenswrapper[4959]: I1007 15:18:49.065986 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58674f758b-wncml_ef3ca2a1-1eed-47fc-8454-47decce134d5/placement-log/0.log" Oct 07 15:18:49 crc kubenswrapper[4959]: I1007 15:18:49.309215 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1/setup-container/0.log" Oct 07 15:18:49 crc kubenswrapper[4959]: I1007 15:18:49.567765 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1/setup-container/0.log" Oct 07 15:18:49 crc kubenswrapper[4959]: I1007 15:18:49.593540 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1/rabbitmq/0.log" Oct 07 15:18:49 crc 
kubenswrapper[4959]: I1007 15:18:49.818726 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_52260e60-f3cc-46d0-b7ce-0424500d0573/setup-container/0.log" Oct 07 15:18:50 crc kubenswrapper[4959]: I1007 15:18:50.067978 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_52260e60-f3cc-46d0-b7ce-0424500d0573/setup-container/0.log" Oct 07 15:18:50 crc kubenswrapper[4959]: I1007 15:18:50.077347 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_52260e60-f3cc-46d0-b7ce-0424500d0573/rabbitmq/0.log" Oct 07 15:18:50 crc kubenswrapper[4959]: I1007 15:18:50.370869 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv_1d2aa3cc-f250-4d9e-b6da-921018115809/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:50 crc kubenswrapper[4959]: I1007 15:18:50.668340 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs_1522ab05-1ecc-4aad-8196-557397dd2ebf/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:50 crc kubenswrapper[4959]: I1007 15:18:50.809412 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:18:50 crc kubenswrapper[4959]: E1007 15:18:50.809791 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:18:50 crc kubenswrapper[4959]: I1007 15:18:50.952870 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-t4pk5_feecb62b-99f0-41a7-80ce-3e8538801512/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:51 crc kubenswrapper[4959]: I1007 15:18:51.233049 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5cbpb_e4d99350-2d4f-451a-a539-e7a72f41ad3a/ssh-known-hosts-edpm-deployment/0.log" Oct 07 15:18:51 crc kubenswrapper[4959]: I1007 15:18:51.611198 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_b14b7636-6093-478a-945a-a512ef1935b4/tempest-tests-tempest-tests-runner/0.log" Oct 07 15:18:51 crc kubenswrapper[4959]: I1007 15:18:51.890515 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8/tempest-tests-tempest-tests-runner/0.log" Oct 07 15:18:52 crc kubenswrapper[4959]: I1007 15:18:52.109562 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825/test-operator-logs-container/0.log" Oct 07 15:18:52 crc kubenswrapper[4959]: I1007 15:18:52.356004 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_8ddae6ed-5ca7-45f7-bf73-afea2af7d7de/test-operator-logs-container/0.log" Oct 07 15:18:52 crc kubenswrapper[4959]: I1007 15:18:52.594703 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2611594a-b816-4cdc-b55b-d6ac6e281071/test-operator-logs-container/0.log" Oct 07 15:18:52 crc kubenswrapper[4959]: I1007 15:18:52.834181 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_a85e3cec-d699-4a9f-9da3-809799b06f1c/test-operator-logs-container/0.log" Oct 07 
15:18:53 crc kubenswrapper[4959]: I1007 15:18:53.062688 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_36d2c381-4cb1-4b35-b315-d1d4847f70c7/tobiko-tests-tobiko/0.log" Oct 07 15:18:53 crc kubenswrapper[4959]: I1007 15:18:53.315853 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_6d961e86-0037-4c2a-ac1f-b73c10339406/tobiko-tests-tobiko/0.log" Oct 07 15:18:53 crc kubenswrapper[4959]: I1007 15:18:53.575709 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs_0d3a592c-85aa-455c-a39e-cf2ec5c1f292/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:53 crc kubenswrapper[4959]: I1007 15:18:53.682611 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_72f6396e-c1ff-485b-8878-33f9ab5dc874/memcached/0.log" Oct 07 15:18:57 crc kubenswrapper[4959]: I1007 15:18:57.448380 4959 scope.go:117] "RemoveContainer" containerID="8251cd1b5d3f7bf2cf8e85ac95d29b474da2e9801bb27513b5585704dc25de35" Oct 07 15:18:57 crc kubenswrapper[4959]: I1007 15:18:57.485530 4959 scope.go:117] "RemoveContainer" containerID="aa9951be147a16edab982de8de244a401e61e52a0c3ff5b2cd112cec1d555953" Oct 07 15:19:05 crc kubenswrapper[4959]: I1007 15:19:05.809097 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:19:05 crc kubenswrapper[4959]: E1007 15:19:05.809936 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 
07 15:19:17 crc kubenswrapper[4959]: I1007 15:19:17.809761 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:19:17 crc kubenswrapper[4959]: E1007 15:19:17.810592 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:19:29 crc kubenswrapper[4959]: I1007 15:19:29.810040 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:19:29 crc kubenswrapper[4959]: E1007 15:19:29.810862 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:19:44 crc kubenswrapper[4959]: I1007 15:19:44.810294 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:19:44 crc kubenswrapper[4959]: E1007 15:19:44.811146 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" 
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:19:59 crc kubenswrapper[4959]: I1007 15:19:59.811155 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:19:59 crc kubenswrapper[4959]: E1007 15:19:59.812163 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:20:02 crc kubenswrapper[4959]: I1007 15:20:02.097467 4959 generic.go:334] "Generic (PLEG): container finished" podID="18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" containerID="786e6c17e12ff45d3b00e6729032f6752683731786e8bd0d3bc20326fc055440" exitCode=0 Oct 07 15:20:02 crc kubenswrapper[4959]: I1007 15:20:02.097655 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-k2jv6" event={"ID":"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063","Type":"ContainerDied","Data":"786e6c17e12ff45d3b00e6729032f6752683731786e8bd0d3bc20326fc055440"} Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.300866 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.338852 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbpzw\" (UniqueName: \"kubernetes.io/projected/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-kube-api-access-gbpzw\") pod \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.338944 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-host\") pod \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\" (UID: \"18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063\") " Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.339550 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-host" (OuterVolumeSpecName: "host") pod "18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" (UID: "18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.372712 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k527r/crc-debug-k2jv6"] Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.373848 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-kube-api-access-gbpzw" (OuterVolumeSpecName: "kube-api-access-gbpzw") pod "18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" (UID: "18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063"). InnerVolumeSpecName "kube-api-access-gbpzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.385992 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k527r/crc-debug-k2jv6"] Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.442143 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbpzw\" (UniqueName: \"kubernetes.io/projected/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-kube-api-access-gbpzw\") on node \"crc\" DevicePath \"\"" Oct 07 15:20:03 crc kubenswrapper[4959]: I1007 15:20:03.442169 4959 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.126652 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3062d5ba5587baba2b2f734e916c34ce3c474dd87c3617f87b9316c71bb6a2d9" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.126793 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-k2jv6" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.751549 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k527r/crc-debug-9vwkx"] Oct 07 15:20:04 crc kubenswrapper[4959]: E1007 15:20:04.752689 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" containerName="container-00" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.752708 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" containerName="container-00" Oct 07 15:20:04 crc kubenswrapper[4959]: E1007 15:20:04.752735 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="registry-server" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.752744 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="registry-server" Oct 07 15:20:04 crc kubenswrapper[4959]: E1007 15:20:04.752758 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="extract-content" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.752767 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="extract-content" Oct 07 15:20:04 crc kubenswrapper[4959]: E1007 15:20:04.752797 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="extract-utilities" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.752804 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" containerName="extract-utilities" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.753070 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dea484-d6ed-4d86-aab0-e3f5984886ee" 
containerName="registry-server" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.753088 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" containerName="container-00" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.754092 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.771198 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwcgp\" (UniqueName: \"kubernetes.io/projected/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-kube-api-access-zwcgp\") pod \"crc-debug-9vwkx\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.771606 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-host\") pod \"crc-debug-9vwkx\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.824841 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063" path="/var/lib/kubelet/pods/18291fa4-7f6b-4c2a-9bb6-3cdfc27f3063/volumes" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.873004 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-host\") pod \"crc-debug-9vwkx\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.873121 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-host\") pod \"crc-debug-9vwkx\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.873161 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwcgp\" (UniqueName: \"kubernetes.io/projected/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-kube-api-access-zwcgp\") pod \"crc-debug-9vwkx\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:04 crc kubenswrapper[4959]: I1007 15:20:04.902345 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwcgp\" (UniqueName: \"kubernetes.io/projected/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-kube-api-access-zwcgp\") pod \"crc-debug-9vwkx\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:05 crc kubenswrapper[4959]: I1007 15:20:05.080891 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:05 crc kubenswrapper[4959]: E1007 15:20:05.810171 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:20:06 crc kubenswrapper[4959]: I1007 15:20:06.153069 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-9vwkx" event={"ID":"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a","Type":"ContainerStarted","Data":"564ef2224ae46f7fc7ffc9a5f87a96162bd785a353142f80e0aedd81cec1942a"} Oct 07 15:20:06 crc kubenswrapper[4959]: I1007 15:20:06.153568 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-9vwkx" event={"ID":"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a","Type":"ContainerStarted","Data":"b702871839109aa05ff0966b93629548f3dee4281f7fbf386e132a2eef5e8a49"} Oct 07 15:20:06 crc kubenswrapper[4959]: I1007 15:20:06.180440 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k527r/crc-debug-9vwkx" podStartSLOduration=2.180399356 podStartE2EDuration="2.180399356s" podCreationTimestamp="2025-10-07 15:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:20:06.168070882 +0000 UTC m=+8358.328793569" watchObservedRunningTime="2025-10-07 15:20:06.180399356 +0000 UTC m=+8358.341122053" Oct 07 15:20:07 crc kubenswrapper[4959]: I1007 15:20:07.176206 4959 generic.go:334] "Generic (PLEG): container finished" podID="91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" containerID="564ef2224ae46f7fc7ffc9a5f87a96162bd785a353142f80e0aedd81cec1942a" exitCode=0 Oct 07 15:20:07 crc kubenswrapper[4959]: I1007 15:20:07.176285 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-k527r/crc-debug-9vwkx" event={"ID":"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a","Type":"ContainerDied","Data":"564ef2224ae46f7fc7ffc9a5f87a96162bd785a353142f80e0aedd81cec1942a"} Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.310206 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.448599 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-host\") pod \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.448683 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwcgp\" (UniqueName: \"kubernetes.io/projected/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-kube-api-access-zwcgp\") pod \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\" (UID: \"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a\") " Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.448702 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-host" (OuterVolumeSpecName: "host") pod "91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" (UID: "91eaed04-b311-4daa-8fd4-f8c4a8b04d6a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.449259 4959 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.458057 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-kube-api-access-zwcgp" (OuterVolumeSpecName: "kube-api-access-zwcgp") pod "91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" (UID: "91eaed04-b311-4daa-8fd4-f8c4a8b04d6a"). InnerVolumeSpecName "kube-api-access-zwcgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:20:08 crc kubenswrapper[4959]: I1007 15:20:08.550878 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwcgp\" (UniqueName: \"kubernetes.io/projected/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a-kube-api-access-zwcgp\") on node \"crc\" DevicePath \"\"" Oct 07 15:20:09 crc kubenswrapper[4959]: I1007 15:20:09.199973 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-9vwkx" event={"ID":"91eaed04-b311-4daa-8fd4-f8c4a8b04d6a","Type":"ContainerDied","Data":"b702871839109aa05ff0966b93629548f3dee4281f7fbf386e132a2eef5e8a49"} Oct 07 15:20:09 crc kubenswrapper[4959]: I1007 15:20:09.200378 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b702871839109aa05ff0966b93629548f3dee4281f7fbf386e132a2eef5e8a49" Oct 07 15:20:09 crc kubenswrapper[4959]: I1007 15:20:09.200182 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-9vwkx" Oct 07 15:20:13 crc kubenswrapper[4959]: I1007 15:20:13.808764 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:20:14 crc kubenswrapper[4959]: I1007 15:20:14.253963 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"1f81262f8f671a04434be7cd785ed736a33dcfb2d12c747b6f73ee2bb5bd4c14"} Oct 07 15:20:18 crc kubenswrapper[4959]: I1007 15:20:18.219577 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k527r/crc-debug-9vwkx"] Oct 07 15:20:18 crc kubenswrapper[4959]: I1007 15:20:18.233444 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k527r/crc-debug-9vwkx"] Oct 07 15:20:18 crc kubenswrapper[4959]: I1007 15:20:18.826431 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" path="/var/lib/kubelet/pods/91eaed04-b311-4daa-8fd4-f8c4a8b04d6a/volumes" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.437076 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k527r/crc-debug-nk445"] Oct 07 15:20:19 crc kubenswrapper[4959]: E1007 15:20:19.437758 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" containerName="container-00" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.437775 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" containerName="container-00" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.438032 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="91eaed04-b311-4daa-8fd4-f8c4a8b04d6a" containerName="container-00" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.438916 
4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.609962 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82627dec-8c77-4ad8-b696-8c51d938272f-host\") pod \"crc-debug-nk445\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.610326 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rvl6\" (UniqueName: \"kubernetes.io/projected/82627dec-8c77-4ad8-b696-8c51d938272f-kube-api-access-2rvl6\") pod \"crc-debug-nk445\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.713029 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rvl6\" (UniqueName: \"kubernetes.io/projected/82627dec-8c77-4ad8-b696-8c51d938272f-kube-api-access-2rvl6\") pod \"crc-debug-nk445\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.713797 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82627dec-8c77-4ad8-b696-8c51d938272f-host\") pod \"crc-debug-nk445\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.713897 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82627dec-8c77-4ad8-b696-8c51d938272f-host\") pod \"crc-debug-nk445\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " 
pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.741520 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rvl6\" (UniqueName: \"kubernetes.io/projected/82627dec-8c77-4ad8-b696-8c51d938272f-kube-api-access-2rvl6\") pod \"crc-debug-nk445\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:19 crc kubenswrapper[4959]: I1007 15:20:19.762499 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:20 crc kubenswrapper[4959]: I1007 15:20:20.323165 4959 generic.go:334] "Generic (PLEG): container finished" podID="82627dec-8c77-4ad8-b696-8c51d938272f" containerID="84c5a6eff1d714e00db6d64c735c01b9737ff3c7b183a306f67172531fcc466d" exitCode=0 Oct 07 15:20:20 crc kubenswrapper[4959]: I1007 15:20:20.323254 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-nk445" event={"ID":"82627dec-8c77-4ad8-b696-8c51d938272f","Type":"ContainerDied","Data":"84c5a6eff1d714e00db6d64c735c01b9737ff3c7b183a306f67172531fcc466d"} Oct 07 15:20:20 crc kubenswrapper[4959]: I1007 15:20:20.323514 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/crc-debug-nk445" event={"ID":"82627dec-8c77-4ad8-b696-8c51d938272f","Type":"ContainerStarted","Data":"16369966e2b380656af009a9762b3037d30c6d26177e5fb9b7aca87b9f00d98f"} Oct 07 15:20:20 crc kubenswrapper[4959]: I1007 15:20:20.369346 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k527r/crc-debug-nk445"] Oct 07 15:20:20 crc kubenswrapper[4959]: I1007 15:20:20.378887 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k527r/crc-debug-nk445"] Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.475750 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.553964 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rvl6\" (UniqueName: \"kubernetes.io/projected/82627dec-8c77-4ad8-b696-8c51d938272f-kube-api-access-2rvl6\") pod \"82627dec-8c77-4ad8-b696-8c51d938272f\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.554387 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82627dec-8c77-4ad8-b696-8c51d938272f-host\") pod \"82627dec-8c77-4ad8-b696-8c51d938272f\" (UID: \"82627dec-8c77-4ad8-b696-8c51d938272f\") " Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.554491 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82627dec-8c77-4ad8-b696-8c51d938272f-host" (OuterVolumeSpecName: "host") pod "82627dec-8c77-4ad8-b696-8c51d938272f" (UID: "82627dec-8c77-4ad8-b696-8c51d938272f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.555229 4959 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82627dec-8c77-4ad8-b696-8c51d938272f-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.560687 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82627dec-8c77-4ad8-b696-8c51d938272f-kube-api-access-2rvl6" (OuterVolumeSpecName: "kube-api-access-2rvl6") pod "82627dec-8c77-4ad8-b696-8c51d938272f" (UID: "82627dec-8c77-4ad8-b696-8c51d938272f"). InnerVolumeSpecName "kube-api-access-2rvl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:20:21 crc kubenswrapper[4959]: I1007 15:20:21.657250 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rvl6\" (UniqueName: \"kubernetes.io/projected/82627dec-8c77-4ad8-b696-8c51d938272f-kube-api-access-2rvl6\") on node \"crc\" DevicePath \"\"" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.054523 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/util/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.239413 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/util/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.292945 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/pull/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.293012 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/pull/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.362333 4959 scope.go:117] "RemoveContainer" containerID="84c5a6eff1d714e00db6d64c735c01b9737ff3c7b183a306f67172531fcc466d" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.362385 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/crc-debug-nk445" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.531664 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/util/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.553969 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/pull/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.555877 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/extract/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.752174 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f56ff694-b4rhk_c2a805f1-946a-4b48-9e52-4f24b56bd43a/kube-rbac-proxy/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.815420 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f56ff694-b4rhk_c2a805f1-946a-4b48-9e52-4f24b56bd43a/manager/0.log" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.826014 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82627dec-8c77-4ad8-b696-8c51d938272f" path="/var/lib/kubelet/pods/82627dec-8c77-4ad8-b696-8c51d938272f/volumes" Oct 07 15:20:22 crc kubenswrapper[4959]: I1007 15:20:22.845696 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-zl4v9_3e59fb97-6ef4-42a5-a264-506bdccd8a23/kube-rbac-proxy/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.015908 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-xmpkg_01b13867-f984-4d88-af12-28fc3ebc0b9f/kube-rbac-proxy/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.037305 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-zl4v9_3e59fb97-6ef4-42a5-a264-506bdccd8a23/manager/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.101692 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-xmpkg_01b13867-f984-4d88-af12-28fc3ebc0b9f/manager/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.318097 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-fd648f65-rmk5h_035c3aeb-396b-47bf-a588-562bb0f27f88/kube-rbac-proxy/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.383127 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-fd648f65-rmk5h_035c3aeb-396b-47bf-a588-562bb0f27f88/manager/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.550198 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7ccfc8cf49-d4g6g_4294ed44-d412-4366-959e-cb534ab792bc/manager/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.560377 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7ccfc8cf49-d4g6g_4294ed44-d412-4366-959e-cb534ab792bc/kube-rbac-proxy/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.642189 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b477879bc-nf6mt_539702ff-226a-4c31-b715-af9af8ae1205/kube-rbac-proxy/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.760477 4959 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b477879bc-nf6mt_539702ff-226a-4c31-b715-af9af8ae1205/manager/0.log" Oct 07 15:20:23 crc kubenswrapper[4959]: I1007 15:20:23.814823 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-d772s_e3213b12-9128-4d7c-8ec8-a731e6627de4/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.038306 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-d772s_e3213b12-9128-4d7c-8ec8-a731e6627de4/manager/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.079194 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5467f8988c-6t98f_749f8ff6-9e1c-45ef-948f-1f8c255b670e/manager/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.096908 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5467f8988c-6t98f_749f8ff6-9e1c-45ef-948f-1f8c255b670e/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.245715 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5b84cc7657-r57lc_77bcfec2-4667-4415-af5e-3009e5ea4999/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.350235 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5b84cc7657-r57lc_77bcfec2-4667-4415-af5e-3009e5ea4999/manager/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.452877 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-26qz6_6e224af6-7095-4878-ba65-3a8e3f358968/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc 
kubenswrapper[4959]: I1007 15:20:24.518654 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-26qz6_6e224af6-7095-4878-ba65-3a8e3f358968/manager/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.620861 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-6kq7p_5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.716353 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-6kq7p_5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85/manager/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.800280 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-6tcd6_eea6d6d3-ded0-4788-8901-34c02d659aee/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc kubenswrapper[4959]: I1007 15:20:24.889087 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-6tcd6_eea6d6d3-ded0-4788-8901-34c02d659aee/manager/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.043910 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-t2wjn_64d46cbf-e1a4-4673-9f0e-01371175a1f9/kube-rbac-proxy/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.106382 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-t2wjn_64d46cbf-e1a4-4673-9f0e-01371175a1f9/manager/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.243945 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-sfhzx_171d0807-668d-4284-ab63-698401676fbe/manager/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.249188 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-sfhzx_171d0807-668d-4284-ab63-698401676fbe/kube-rbac-proxy/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.400452 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw_0e606b13-be7c-4699-bb4b-5c50ddf32426/kube-rbac-proxy/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.458111 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw_0e606b13-be7c-4699-bb4b-5c50ddf32426/manager/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.571525 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-fd79fd9-ktzrv_370cd57f-855c-4584-a0c1-c806f93bd8d7/kube-rbac-proxy/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.704660 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86c7c896d7-mzwlr_88b5404c-1e9b-42c9-9c21-fb32b136db86/kube-rbac-proxy/0.log" Oct 07 15:20:25 crc kubenswrapper[4959]: I1007 15:20:25.777244 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86c7c896d7-mzwlr_88b5404c-1e9b-42c9-9c21-fb32b136db86/operator/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.007342 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54d485fd9-2vwpz_f8ddf44b-e556-40c6-a3f8-699d756434dd/kube-rbac-proxy/0.log" Oct 07 15:20:26 crc 
kubenswrapper[4959]: I1007 15:20:26.210972 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vptcw_8a32157a-8fdd-4430-9d22-3401166e4352/registry-server/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.217832 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54d485fd9-2vwpz_f8ddf44b-e556-40c6-a3f8-699d756434dd/manager/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.222120 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-nkfnv_d8ff35a5-f26c-4077-bdad-baa63159c6e4/kube-rbac-proxy/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.456729 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-nkfnv_d8ff35a5-f26c-4077-bdad-baa63159c6e4/manager/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.473993 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp_1f48e97d-5d4f-49d3-b550-d51242109806/operator/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.756743 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-k6btb_86969b11-9037-4890-93dc-575b83669d0f/kube-rbac-proxy/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.766034 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-sp68w_cac1fe47-f06a-44fb-b4fe-a19faa802cca/kube-rbac-proxy/0.log" Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.792961 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-k6btb_86969b11-9037-4890-93dc-575b83669d0f/manager/0.log" 
Oct 07 15:20:26 crc kubenswrapper[4959]: I1007 15:20:26.939199 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-fd79fd9-ktzrv_370cd57f-855c-4584-a0c1-c806f93bd8d7/manager/0.log" Oct 07 15:20:27 crc kubenswrapper[4959]: I1007 15:20:27.018837 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55c6894594-6pn9s_6108c0b3-e7a9-412c-9085-0eea09f342c6/kube-rbac-proxy/0.log" Oct 07 15:20:27 crc kubenswrapper[4959]: I1007 15:20:27.092032 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-sp68w_cac1fe47-f06a-44fb-b4fe-a19faa802cca/manager/0.log" Oct 07 15:20:27 crc kubenswrapper[4959]: I1007 15:20:27.094223 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55c6894594-6pn9s_6108c0b3-e7a9-412c-9085-0eea09f342c6/manager/0.log" Oct 07 15:20:27 crc kubenswrapper[4959]: I1007 15:20:27.240143 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-rgw4d_1062e16d-6129-48d2-a385-d988ac5fe4f7/kube-rbac-proxy/0.log" Oct 07 15:20:27 crc kubenswrapper[4959]: I1007 15:20:27.266342 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-rgw4d_1062e16d-6129-48d2-a385-d988ac5fe4f7/manager/0.log" Oct 07 15:20:44 crc kubenswrapper[4959]: I1007 15:20:44.211508 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w4dpf_f8122128-1530-410d-a26b-068922cea39b/control-plane-machine-set-operator/0.log" Oct 07 15:20:44 crc kubenswrapper[4959]: I1007 15:20:44.431130 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c5bnk_d0203e72-df97-4a97-8f45-65175f7d9839/machine-api-operator/0.log" Oct 07 15:20:44 crc kubenswrapper[4959]: I1007 15:20:44.441120 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c5bnk_d0203e72-df97-4a97-8f45-65175f7d9839/kube-rbac-proxy/0.log" Oct 07 15:20:56 crc kubenswrapper[4959]: I1007 15:20:56.548319 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-d6468_fe93de9f-3c30-4373-bc80-912dd219d1f9/cert-manager-controller/0.log" Oct 07 15:20:56 crc kubenswrapper[4959]: I1007 15:20:56.753533 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-585vd_897ad114-2a60-468e-8c81-2367ded7fe7b/cert-manager-cainjector/0.log" Oct 07 15:20:56 crc kubenswrapper[4959]: I1007 15:20:56.786051 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hp7q8_617c6991-922b-4bd1-b578-2327061ba973/cert-manager-webhook/0.log" Oct 07 15:21:08 crc kubenswrapper[4959]: I1007 15:21:08.650158 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-bc4dg_e91d7e0c-6f6c-4305-88f1-316fda279894/nmstate-console-plugin/0.log" Oct 07 15:21:08 crc kubenswrapper[4959]: I1007 15:21:08.860478 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-trnlr_0ce82e41-c87c-4a6a-85c5-63fa6986a917/nmstate-handler/0.log" Oct 07 15:21:08 crc kubenswrapper[4959]: I1007 15:21:08.871483 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-bqt8v_b74adf76-6b8e-4df4-a786-b241afc85aaf/kube-rbac-proxy/0.log" Oct 07 15:21:08 crc kubenswrapper[4959]: I1007 15:21:08.934680 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-bqt8v_b74adf76-6b8e-4df4-a786-b241afc85aaf/nmstate-metrics/0.log" Oct 07 15:21:09 crc kubenswrapper[4959]: I1007 15:21:09.069042 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-qxq76_ecfe0080-5a6d-4580-957f-9b07016a6f38/nmstate-operator/0.log" Oct 07 15:21:09 crc kubenswrapper[4959]: I1007 15:21:09.157213 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-648vt_b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb/nmstate-webhook/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.044420 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7b7r7_ec5e2185-a03f-459b-95ce-cf8a04c9742d/kube-rbac-proxy/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.171634 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7b7r7_ec5e2185-a03f-459b-95ce-cf8a04c9742d/controller/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.250071 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.424582 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.434551 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.446867 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log" Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.446921 4959 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.644861 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.666468 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.680916 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.704260 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.860927 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.872445 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.890592 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log"
Oct 07 15:21:22 crc kubenswrapper[4959]: I1007 15:21:22.905884 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/controller/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.065227 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/frr-metrics/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.094539 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/kube-rbac-proxy/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.131657 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/kube-rbac-proxy-frr/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.297475 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/reloader/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.419042 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-thbcx_ea4413f6-7433-4301-856f-51073cbf20b0/frr-k8s-webhook-server/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.640268 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79dcdc88ff-jv2fl_c2450601-6bf6-4ee8-af46-dafc0db98d8c/manager/0.log"
Oct 07 15:21:23 crc kubenswrapper[4959]: I1007 15:21:23.838243 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-678c485567-fb7ps_d1fbc67f-df69-48a4-87a0-e9d429eca6f1/webhook-server/0.log"
Oct 07 15:21:24 crc kubenswrapper[4959]: I1007 15:21:24.007396 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dgrrl_4a978343-2c48-4153-a20e-631bbe3c1595/kube-rbac-proxy/0.log"
Oct 07 15:21:24 crc kubenswrapper[4959]: I1007 15:21:24.577878 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dgrrl_4a978343-2c48-4153-a20e-631bbe3c1595/speaker/0.log"
Oct 07 15:21:25 crc kubenswrapper[4959]: I1007 15:21:25.279857 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/frr/0.log"
Oct 07 15:21:29 crc kubenswrapper[4959]: E1007 15:21:29.809491 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Oct 07 15:21:35 crc kubenswrapper[4959]: I1007 15:21:35.993107 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/util/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.201196 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/util/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.211421 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/pull/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.249384 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/pull/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.464487 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/util/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.466040 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/pull/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.493796 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/extract/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.630989 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-utilities/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.777581 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-utilities/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.800032 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-content/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.832512 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-content/0.log"
Oct 07 15:21:36 crc kubenswrapper[4959]: I1007 15:21:36.997973 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-content/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.039496 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-utilities/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.236553 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-utilities/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.246738 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/registry-server/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.378110 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-content/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.402178 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-utilities/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.430305 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-content/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.656964 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-utilities/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.687519 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-content/0.log"
Oct 07 15:21:37 crc kubenswrapper[4959]: I1007 15:21:37.951507 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/util/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.160759 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/util/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.211876 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/pull/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.252401 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/pull/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.539605 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/pull/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.550578 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/extract/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.553675 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/util/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.834322 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/registry-server/0.log"
Oct 07 15:21:38 crc kubenswrapper[4959]: I1007 15:21:38.847216 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fbtxf_0548f538-781a-406b-8d2c-4449281cc77c/marketplace-operator/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.008872 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-utilities/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.145833 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-content/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.164183 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-utilities/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.183897 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-content/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.412240 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-content/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.412495 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-utilities/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.624783 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-utilities/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.755028 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/registry-server/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.857440 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-utilities/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.891598 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-content/0.log"
Oct 07 15:21:39 crc kubenswrapper[4959]: I1007 15:21:39.933661 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-content/0.log"
Oct 07 15:21:40 crc kubenswrapper[4959]: I1007 15:21:40.116056 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-content/0.log"
Oct 07 15:21:40 crc kubenswrapper[4959]: I1007 15:21:40.165967 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-utilities/0.log"
Oct 07 15:21:40 crc kubenswrapper[4959]: I1007 15:21:40.555595 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/registry-server/0.log"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.825485 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-55www"]
Oct 07 15:22:01 crc kubenswrapper[4959]: E1007 15:22:01.826999 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82627dec-8c77-4ad8-b696-8c51d938272f" containerName="container-00"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.827016 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="82627dec-8c77-4ad8-b696-8c51d938272f" containerName="container-00"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.827246 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="82627dec-8c77-4ad8-b696-8c51d938272f" containerName="container-00"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.829106 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.857791 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55www"]
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.875325 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-utilities\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.875428 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wsbg\" (UniqueName: \"kubernetes.io/projected/af568d11-8b61-42cf-acb9-ea2b7491dd64-kube-api-access-2wsbg\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.875498 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-catalog-content\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.977387 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-catalog-content\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.977567 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-utilities\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.977664 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wsbg\" (UniqueName: \"kubernetes.io/projected/af568d11-8b61-42cf-acb9-ea2b7491dd64-kube-api-access-2wsbg\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.977967 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-catalog-content\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:01 crc kubenswrapper[4959]: I1007 15:22:01.978318 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-utilities\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:02 crc kubenswrapper[4959]: I1007 15:22:02.007091 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wsbg\" (UniqueName: \"kubernetes.io/projected/af568d11-8b61-42cf-acb9-ea2b7491dd64-kube-api-access-2wsbg\") pod \"certified-operators-55www\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") " pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:02 crc kubenswrapper[4959]: I1007 15:22:02.154847 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:02 crc kubenswrapper[4959]: I1007 15:22:02.739937 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55www"]
Oct 07 15:22:03 crc kubenswrapper[4959]: I1007 15:22:03.377321 4959 generic.go:334] "Generic (PLEG): container finished" podID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerID="78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152" exitCode=0
Oct 07 15:22:03 crc kubenswrapper[4959]: I1007 15:22:03.377418 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerDied","Data":"78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152"}
Oct 07 15:22:03 crc kubenswrapper[4959]: I1007 15:22:03.377817 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerStarted","Data":"83ebf53a1d98ee1d63bae5dc6641d220537dc25433175e0dfdea3c18ae38b783"}
Oct 07 15:22:03 crc kubenswrapper[4959]: I1007 15:22:03.380530 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 15:22:05 crc kubenswrapper[4959]: I1007 15:22:05.412558 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerStarted","Data":"bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49"}
Oct 07 15:22:06 crc kubenswrapper[4959]: I1007 15:22:06.435935 4959 generic.go:334] "Generic (PLEG): container finished" podID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerID="bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49" exitCode=0
Oct 07 15:22:06 crc kubenswrapper[4959]: I1007 15:22:06.436478 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerDied","Data":"bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49"}
Oct 07 15:22:07 crc kubenswrapper[4959]: I1007 15:22:07.450930 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerStarted","Data":"fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4"}
Oct 07 15:22:07 crc kubenswrapper[4959]: I1007 15:22:07.479205 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-55www" podStartSLOduration=2.931159044 podStartE2EDuration="6.479180228s" podCreationTimestamp="2025-10-07 15:22:01 +0000 UTC" firstStartedPulling="2025-10-07 15:22:03.380226586 +0000 UTC m=+8475.540949263" lastFinishedPulling="2025-10-07 15:22:06.92824777 +0000 UTC m=+8479.088970447" observedRunningTime="2025-10-07 15:22:07.472784545 +0000 UTC m=+8479.633507232" watchObservedRunningTime="2025-10-07 15:22:07.479180228 +0000 UTC m=+8479.639902905"
Oct 07 15:22:12 crc kubenswrapper[4959]: I1007 15:22:12.155538 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:12 crc kubenswrapper[4959]: I1007 15:22:12.156285 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:12 crc kubenswrapper[4959]: I1007 15:22:12.215130 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:12 crc kubenswrapper[4959]: I1007 15:22:12.562640 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:12 crc kubenswrapper[4959]: I1007 15:22:12.623650 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55www"]
Oct 07 15:22:14 crc kubenswrapper[4959]: I1007 15:22:14.523164 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-55www" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="registry-server" containerID="cri-o://fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4" gracePeriod=2
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.068201 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.141278 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wsbg\" (UniqueName: \"kubernetes.io/projected/af568d11-8b61-42cf-acb9-ea2b7491dd64-kube-api-access-2wsbg\") pod \"af568d11-8b61-42cf-acb9-ea2b7491dd64\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") "
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.141728 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-utilities\") pod \"af568d11-8b61-42cf-acb9-ea2b7491dd64\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") "
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.142003 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-catalog-content\") pod \"af568d11-8b61-42cf-acb9-ea2b7491dd64\" (UID: \"af568d11-8b61-42cf-acb9-ea2b7491dd64\") "
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.143021 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-utilities" (OuterVolumeSpecName: "utilities") pod "af568d11-8b61-42cf-acb9-ea2b7491dd64" (UID: "af568d11-8b61-42cf-acb9-ea2b7491dd64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.153228 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af568d11-8b61-42cf-acb9-ea2b7491dd64-kube-api-access-2wsbg" (OuterVolumeSpecName: "kube-api-access-2wsbg") pod "af568d11-8b61-42cf-acb9-ea2b7491dd64" (UID: "af568d11-8b61-42cf-acb9-ea2b7491dd64"). InnerVolumeSpecName "kube-api-access-2wsbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.210608 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af568d11-8b61-42cf-acb9-ea2b7491dd64" (UID: "af568d11-8b61-42cf-acb9-ea2b7491dd64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.245004 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.245064 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af568d11-8b61-42cf-acb9-ea2b7491dd64-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.245081 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wsbg\" (UniqueName: \"kubernetes.io/projected/af568d11-8b61-42cf-acb9-ea2b7491dd64-kube-api-access-2wsbg\") on node \"crc\" DevicePath \"\""
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.543738 4959 generic.go:334] "Generic (PLEG): container finished" podID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerID="fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4" exitCode=0
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.543805 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55www"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.543824 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerDied","Data":"fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4"}
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.545200 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55www" event={"ID":"af568d11-8b61-42cf-acb9-ea2b7491dd64","Type":"ContainerDied","Data":"83ebf53a1d98ee1d63bae5dc6641d220537dc25433175e0dfdea3c18ae38b783"}
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.545223 4959 scope.go:117] "RemoveContainer" containerID="fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.583982 4959 scope.go:117] "RemoveContainer" containerID="bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.600948 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55www"]
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.633071 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-55www"]
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.642986 4959 scope.go:117] "RemoveContainer" containerID="78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.671350 4959 scope.go:117] "RemoveContainer" containerID="fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4"
Oct 07 15:22:15 crc kubenswrapper[4959]: E1007 15:22:15.674031 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4\": container with ID starting with fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4 not found: ID does not exist" containerID="fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.674083 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4"} err="failed to get container status \"fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4\": rpc error: code = NotFound desc = could not find container \"fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4\": container with ID starting with fdad2d87b50a0db5e8ced3f81344bce06ad9467f7c49f5cf2e9504ee2235c4f4 not found: ID does not exist"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.674107 4959 scope.go:117] "RemoveContainer" containerID="bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49"
Oct 07 15:22:15 crc kubenswrapper[4959]: E1007 15:22:15.674399 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49\": container with ID starting with bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49 not found: ID does not exist" containerID="bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.674426 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49"} err="failed to get container status \"bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49\": rpc error: code = NotFound desc = could not find container \"bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49\": container with ID starting with bcd8b1c4f05c43dd5c69e42fd22cd3c85503fd27a44fd2d31ffe8d9066962d49 not found: ID does not exist"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.674446 4959 scope.go:117] "RemoveContainer" containerID="78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152"
Oct 07 15:22:15 crc kubenswrapper[4959]: E1007 15:22:15.677370 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152\": container with ID starting with 78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152 not found: ID does not exist" containerID="78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152"
Oct 07 15:22:15 crc kubenswrapper[4959]: I1007 15:22:15.677425 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152"} err="failed to get container status \"78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152\": rpc error: code = NotFound desc = could not find container \"78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152\": container with ID starting with 78f361d551bfe9ab03a4d42005e495b0aefa4020948b512e2b37867d04bcc152 not found: ID does not exist"
Oct 07 15:22:16 crc kubenswrapper[4959]: I1007 15:22:16.835890 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" path="/var/lib/kubelet/pods/af568d11-8b61-42cf-acb9-ea2b7491dd64/volumes"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.127836 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xvppd"]
Oct 07 15:22:34 crc kubenswrapper[4959]: E1007 15:22:34.130296 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="extract-utilities"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.130330 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="extract-utilities"
Oct 07 15:22:34 crc kubenswrapper[4959]: E1007 15:22:34.130387 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="extract-content"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.130398 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="extract-content"
Oct 07 15:22:34 crc kubenswrapper[4959]: E1007 15:22:34.130436 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="registry-server"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.130447 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="registry-server"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.130698 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="af568d11-8b61-42cf-acb9-ea2b7491dd64" containerName="registry-server"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.132444 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.144586 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvppd"]
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.221654 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-catalog-content\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.221786 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-utilities\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.221854 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzjj6\" (UniqueName: \"kubernetes.io/projected/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-kube-api-access-bzjj6\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.324258 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-utilities\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.324435 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjj6\" (UniqueName: \"kubernetes.io/projected/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-kube-api-access-bzjj6\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.324996 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-utilities\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.325130 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-catalog-content\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.325423 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-catalog-content\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.348964 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzjj6\" (UniqueName: \"kubernetes.io/projected/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-kube-api-access-bzjj6\") pod \"redhat-operators-xvppd\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.459913 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvppd"
Oct 07 15:22:34 crc kubenswrapper[4959]: E1007 15:22:34.810140 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Oct 07 15:22:34 crc kubenswrapper[4959]: I1007 15:22:34.954076 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvppd"]
Oct 07 15:22:35 crc kubenswrapper[4959]: I1007 15:22:35.808025 4959 generic.go:334] "Generic (PLEG): container finished" podID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerID="09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2" exitCode=0
Oct 07 15:22:35 crc kubenswrapper[4959]: I1007 15:22:35.808237 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerDied","Data":"09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2"}
Oct 07 15:22:35 crc kubenswrapper[4959]: I1007 15:22:35.808583 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerStarted","Data":"2ab9068f6329a4bae696d1e4a0e807ebaaba5e240ae87872bf301f1021b2c202"}
Oct 07 15:22:37 crc kubenswrapper[4959]: I1007 15:22:37.695950 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:22:37 crc kubenswrapper[4959]: I1007 15:22:37.696513 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:22:37 crc kubenswrapper[4959]: I1007 15:22:37.829798 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerStarted","Data":"aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327"} Oct 07 15:22:39 crc kubenswrapper[4959]: I1007 15:22:39.858087 4959 generic.go:334] "Generic (PLEG): container finished" podID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerID="aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327" exitCode=0 Oct 07 15:22:39 crc kubenswrapper[4959]: I1007 15:22:39.858164 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerDied","Data":"aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327"} Oct 07 15:22:40 crc kubenswrapper[4959]: I1007 15:22:40.869006 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerStarted","Data":"4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd"} Oct 07 15:22:40 crc kubenswrapper[4959]: I1007 15:22:40.900411 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xvppd" podStartSLOduration=2.429258178 podStartE2EDuration="6.900392911s" podCreationTimestamp="2025-10-07 15:22:34 +0000 UTC" firstStartedPulling="2025-10-07 15:22:35.813319893 +0000 UTC m=+8507.974042570" lastFinishedPulling="2025-10-07 15:22:40.284454626 +0000 UTC m=+8512.445177303" observedRunningTime="2025-10-07 15:22:40.886815701 +0000 UTC m=+8513.047538388" 
watchObservedRunningTime="2025-10-07 15:22:40.900392911 +0000 UTC m=+8513.061115588" Oct 07 15:22:44 crc kubenswrapper[4959]: I1007 15:22:44.459997 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xvppd" Oct 07 15:22:44 crc kubenswrapper[4959]: I1007 15:22:44.461586 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xvppd" Oct 07 15:22:45 crc kubenswrapper[4959]: I1007 15:22:45.512330 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xvppd" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="registry-server" probeResult="failure" output=< Oct 07 15:22:45 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 15:22:45 crc kubenswrapper[4959]: > Oct 07 15:22:54 crc kubenswrapper[4959]: I1007 15:22:54.542263 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xvppd" Oct 07 15:22:54 crc kubenswrapper[4959]: I1007 15:22:54.610579 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xvppd" Oct 07 15:22:54 crc kubenswrapper[4959]: I1007 15:22:54.785009 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvppd"] Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.023553 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xvppd" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="registry-server" containerID="cri-o://4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd" gracePeriod=2 Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.539038 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xvppd" Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.565837 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-catalog-content\") pod \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.566024 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzjj6\" (UniqueName: \"kubernetes.io/projected/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-kube-api-access-bzjj6\") pod \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.566088 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-utilities\") pod \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\" (UID: \"6b3a4cb9-9025-404c-a1f1-98a262bf7df0\") " Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.567460 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-utilities" (OuterVolumeSpecName: "utilities") pod "6b3a4cb9-9025-404c-a1f1-98a262bf7df0" (UID: "6b3a4cb9-9025-404c-a1f1-98a262bf7df0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.579136 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-kube-api-access-bzjj6" (OuterVolumeSpecName: "kube-api-access-bzjj6") pod "6b3a4cb9-9025-404c-a1f1-98a262bf7df0" (UID: "6b3a4cb9-9025-404c-a1f1-98a262bf7df0"). InnerVolumeSpecName "kube-api-access-bzjj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.668897 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzjj6\" (UniqueName: \"kubernetes.io/projected/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-kube-api-access-bzjj6\") on node \"crc\" DevicePath \"\"" Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.668932 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.679946 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b3a4cb9-9025-404c-a1f1-98a262bf7df0" (UID: "6b3a4cb9-9025-404c-a1f1-98a262bf7df0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:22:56 crc kubenswrapper[4959]: I1007 15:22:56.771868 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3a4cb9-9025-404c-a1f1-98a262bf7df0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.037577 4959 generic.go:334] "Generic (PLEG): container finished" podID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerID="4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd" exitCode=0 Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.037663 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerDied","Data":"4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd"} Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.037689 4959 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvppd" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.037739 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvppd" event={"ID":"6b3a4cb9-9025-404c-a1f1-98a262bf7df0","Type":"ContainerDied","Data":"2ab9068f6329a4bae696d1e4a0e807ebaaba5e240ae87872bf301f1021b2c202"} Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.037763 4959 scope.go:117] "RemoveContainer" containerID="4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.071579 4959 scope.go:117] "RemoveContainer" containerID="aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.080336 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvppd"] Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.092354 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xvppd"] Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.099805 4959 scope.go:117] "RemoveContainer" containerID="09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.157647 4959 scope.go:117] "RemoveContainer" containerID="4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd" Oct 07 15:22:57 crc kubenswrapper[4959]: E1007 15:22:57.158183 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd\": container with ID starting with 4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd not found: ID does not exist" containerID="4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.158237 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd"} err="failed to get container status \"4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd\": rpc error: code = NotFound desc = could not find container \"4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd\": container with ID starting with 4ca7510420a6db40d50c4129b4e76fd189d0afe3fe2b3725c45a3cec878e44cd not found: ID does not exist" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.158272 4959 scope.go:117] "RemoveContainer" containerID="aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327" Oct 07 15:22:57 crc kubenswrapper[4959]: E1007 15:22:57.158701 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327\": container with ID starting with aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327 not found: ID does not exist" containerID="aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.158766 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327"} err="failed to get container status \"aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327\": rpc error: code = NotFound desc = could not find container \"aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327\": container with ID starting with aa54203c8e647c0a52481b14173caf4d11d0d0898dce31e265465c82be290327 not found: ID does not exist" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.158804 4959 scope.go:117] "RemoveContainer" containerID="09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2" Oct 07 15:22:57 crc kubenswrapper[4959]: E1007 
15:22:57.159100 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2\": container with ID starting with 09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2 not found: ID does not exist" containerID="09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2" Oct 07 15:22:57 crc kubenswrapper[4959]: I1007 15:22:57.159131 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2"} err="failed to get container status \"09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2\": rpc error: code = NotFound desc = could not find container \"09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2\": container with ID starting with 09081784c6a06a814450fbc52a25b0b50882b9cdfcdb382ea9bbc9234aca7cc2 not found: ID does not exist" Oct 07 15:22:58 crc kubenswrapper[4959]: I1007 15:22:58.841229 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" path="/var/lib/kubelet/pods/6b3a4cb9-9025-404c-a1f1-98a262bf7df0/volumes" Oct 07 15:23:07 crc kubenswrapper[4959]: I1007 15:23:07.695611 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:23:07 crc kubenswrapper[4959]: I1007 15:23:07.696231 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 07 15:23:37 crc kubenswrapper[4959]: I1007 15:23:37.696214 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:23:37 crc kubenswrapper[4959]: I1007 15:23:37.696787 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:23:37 crc kubenswrapper[4959]: I1007 15:23:37.696836 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" Oct 07 15:23:37 crc kubenswrapper[4959]: I1007 15:23:37.697647 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f81262f8f671a04434be7cd785ed736a33dcfb2d12c747b6f73ee2bb5bd4c14"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:23:37 crc kubenswrapper[4959]: I1007 15:23:37.697700 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://1f81262f8f671a04434be7cd785ed736a33dcfb2d12c747b6f73ee2bb5bd4c14" gracePeriod=600 Oct 07 15:23:38 crc kubenswrapper[4959]: I1007 15:23:38.519187 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" 
containerID="1f81262f8f671a04434be7cd785ed736a33dcfb2d12c747b6f73ee2bb5bd4c14" exitCode=0 Oct 07 15:23:38 crc kubenswrapper[4959]: I1007 15:23:38.519279 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"1f81262f8f671a04434be7cd785ed736a33dcfb2d12c747b6f73ee2bb5bd4c14"} Oct 07 15:23:38 crc kubenswrapper[4959]: I1007 15:23:38.519926 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"} Oct 07 15:23:38 crc kubenswrapper[4959]: I1007 15:23:38.519951 4959 scope.go:117] "RemoveContainer" containerID="cf3dc0ef3c6f765fcdc0b0ff336085d67cf65472ff77bd3cf59c2ee008c633b6" Oct 07 15:23:57 crc kubenswrapper[4959]: I1007 15:23:57.739769 4959 scope.go:117] "RemoveContainer" containerID="786e6c17e12ff45d3b00e6729032f6752683731786e8bd0d3bc20326fc055440" Oct 07 15:23:59 crc kubenswrapper[4959]: E1007 15:23:59.808827 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:24:36 crc kubenswrapper[4959]: I1007 15:24:36.178378 4959 generic.go:334] "Generic (PLEG): container finished" podID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerID="1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94" exitCode=0 Oct 07 15:24:36 crc kubenswrapper[4959]: I1007 15:24:36.179709 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k527r/must-gather-kkl29" 
event={"ID":"8b212cb1-0340-4e4b-8582-7b9cd9429869","Type":"ContainerDied","Data":"1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94"} Oct 07 15:24:36 crc kubenswrapper[4959]: I1007 15:24:36.180182 4959 scope.go:117] "RemoveContainer" containerID="1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94" Oct 07 15:24:36 crc kubenswrapper[4959]: I1007 15:24:36.839815 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k527r_must-gather-kkl29_8b212cb1-0340-4e4b-8582-7b9cd9429869/gather/0.log" Oct 07 15:24:45 crc kubenswrapper[4959]: I1007 15:24:45.536248 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k527r/must-gather-kkl29"] Oct 07 15:24:45 crc kubenswrapper[4959]: I1007 15:24:45.537161 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-k527r/must-gather-kkl29" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="copy" containerID="cri-o://612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc" gracePeriod=2 Oct 07 15:24:45 crc kubenswrapper[4959]: I1007 15:24:45.546290 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k527r/must-gather-kkl29"] Oct 07 15:24:45 crc kubenswrapper[4959]: I1007 15:24:45.966430 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k527r_must-gather-kkl29_8b212cb1-0340-4e4b-8582-7b9cd9429869/copy/0.log" Oct 07 15:24:45 crc kubenswrapper[4959]: I1007 15:24:45.967487 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.107699 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48cpg\" (UniqueName: \"kubernetes.io/projected/8b212cb1-0340-4e4b-8582-7b9cd9429869-kube-api-access-48cpg\") pod \"8b212cb1-0340-4e4b-8582-7b9cd9429869\" (UID: \"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.107779 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b212cb1-0340-4e4b-8582-7b9cd9429869-must-gather-output\") pod \"8b212cb1-0340-4e4b-8582-7b9cd9429869\" (UID: \"8b212cb1-0340-4e4b-8582-7b9cd9429869\") " Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.115690 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b212cb1-0340-4e4b-8582-7b9cd9429869-kube-api-access-48cpg" (OuterVolumeSpecName: "kube-api-access-48cpg") pod "8b212cb1-0340-4e4b-8582-7b9cd9429869" (UID: "8b212cb1-0340-4e4b-8582-7b9cd9429869"). InnerVolumeSpecName "kube-api-access-48cpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.212378 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48cpg\" (UniqueName: \"kubernetes.io/projected/8b212cb1-0340-4e4b-8582-7b9cd9429869-kube-api-access-48cpg\") on node \"crc\" DevicePath \"\"" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.296437 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b212cb1-0340-4e4b-8582-7b9cd9429869-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8b212cb1-0340-4e4b-8582-7b9cd9429869" (UID: "8b212cb1-0340-4e4b-8582-7b9cd9429869"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.314568 4959 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b212cb1-0340-4e4b-8582-7b9cd9429869-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.331279 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k527r_must-gather-kkl29_8b212cb1-0340-4e4b-8582-7b9cd9429869/copy/0.log" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.331601 4959 generic.go:334] "Generic (PLEG): container finished" podID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerID="612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc" exitCode=143 Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.331676 4959 scope.go:117] "RemoveContainer" containerID="612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.331845 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k527r/must-gather-kkl29" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.382813 4959 scope.go:117] "RemoveContainer" containerID="1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.545064 4959 scope.go:117] "RemoveContainer" containerID="612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc" Oct 07 15:24:46 crc kubenswrapper[4959]: E1007 15:24:46.545830 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc\": container with ID starting with 612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc not found: ID does not exist" containerID="612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.545876 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc"} err="failed to get container status \"612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc\": rpc error: code = NotFound desc = could not find container \"612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc\": container with ID starting with 612ae0b932603a5f550fbded97cbf2528e648972d5046c3de712919048f9d0cc not found: ID does not exist" Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.545907 4959 scope.go:117] "RemoveContainer" containerID="1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94" Oct 07 15:24:46 crc kubenswrapper[4959]: E1007 15:24:46.546295 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94\": container with ID starting with 
1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94 not found: ID does not exist" containerID="1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94"
Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.546336 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94"} err="failed to get container status \"1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94\": rpc error: code = NotFound desc = could not find container \"1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94\": container with ID starting with 1de8111d16dde20c70633a111c4861e1b11bff3a5564b0b0d97dee9894193f94 not found: ID does not exist"
Oct 07 15:24:46 crc kubenswrapper[4959]: I1007 15:24:46.828258 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" path="/var/lib/kubelet/pods/8b212cb1-0340-4e4b-8582-7b9cd9429869/volumes"
Oct 07 15:25:25 crc kubenswrapper[4959]: E1007 15:25:25.810534 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.640196 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hc7gg/must-gather-n2n6r"]
Oct 07 15:25:34 crc kubenswrapper[4959]: E1007 15:25:34.641814 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="extract-content"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.641836 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="extract-content"
Oct 07 15:25:34 crc kubenswrapper[4959]: E1007 15:25:34.641860 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="gather"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.641870 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="gather"
Oct 07 15:25:34 crc kubenswrapper[4959]: E1007 15:25:34.641911 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="extract-utilities"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.641921 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="extract-utilities"
Oct 07 15:25:34 crc kubenswrapper[4959]: E1007 15:25:34.641935 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="registry-server"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.641943 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="registry-server"
Oct 07 15:25:34 crc kubenswrapper[4959]: E1007 15:25:34.641971 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="copy"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.641979 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="copy"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.642263 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="gather"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.642294 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3a4cb9-9025-404c-a1f1-98a262bf7df0" containerName="registry-server"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.642311 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b212cb1-0340-4e4b-8582-7b9cd9429869" containerName="copy"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.644958 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.651109 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hc7gg"/"openshift-service-ca.crt"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.651778 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hc7gg"/"kube-root-ca.crt"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.656413 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hc7gg/must-gather-n2n6r"]
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.801302 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdvl\" (UniqueName: \"kubernetes.io/projected/bcf6dce6-035d-4780-8841-08fa857032f9-kube-api-access-6tdvl\") pod \"must-gather-n2n6r\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.801548 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcf6dce6-035d-4780-8841-08fa857032f9-must-gather-output\") pod \"must-gather-n2n6r\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.904322 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcf6dce6-035d-4780-8841-08fa857032f9-must-gather-output\") pod \"must-gather-n2n6r\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.904806 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdvl\" (UniqueName: \"kubernetes.io/projected/bcf6dce6-035d-4780-8841-08fa857032f9-kube-api-access-6tdvl\") pod \"must-gather-n2n6r\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.904943 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcf6dce6-035d-4780-8841-08fa857032f9-must-gather-output\") pod \"must-gather-n2n6r\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.928961 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdvl\" (UniqueName: \"kubernetes.io/projected/bcf6dce6-035d-4780-8841-08fa857032f9-kube-api-access-6tdvl\") pod \"must-gather-n2n6r\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:34 crc kubenswrapper[4959]: I1007 15:25:34.970456 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/must-gather-n2n6r"
Oct 07 15:25:35 crc kubenswrapper[4959]: I1007 15:25:35.480139 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hc7gg/must-gather-n2n6r"]
Oct 07 15:25:35 crc kubenswrapper[4959]: I1007 15:25:35.890587 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" event={"ID":"bcf6dce6-035d-4780-8841-08fa857032f9","Type":"ContainerStarted","Data":"15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab"}
Oct 07 15:25:35 crc kubenswrapper[4959]: I1007 15:25:35.891129 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" event={"ID":"bcf6dce6-035d-4780-8841-08fa857032f9","Type":"ContainerStarted","Data":"1505032930d3d4ed5a4887ed1139301abee4594d035e4630ceff9c4178cbd761"}
Oct 07 15:25:36 crc kubenswrapper[4959]: I1007 15:25:36.903232 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" event={"ID":"bcf6dce6-035d-4780-8841-08fa857032f9","Type":"ContainerStarted","Data":"4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a"}
Oct 07 15:25:36 crc kubenswrapper[4959]: I1007 15:25:36.930230 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" podStartSLOduration=2.9302015470000002 podStartE2EDuration="2.930201547s" podCreationTimestamp="2025-10-07 15:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:25:36.923045462 +0000 UTC m=+8689.083768139" watchObservedRunningTime="2025-10-07 15:25:36.930201547 +0000 UTC m=+8689.090924234"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.351287 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-r6gwc"]
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.353636 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.358686 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hc7gg"/"default-dockercfg-5lm25"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.455260 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59gqs\" (UniqueName: \"kubernetes.io/projected/18acf442-45f3-47dc-90e0-1ae032f6de41-kube-api-access-59gqs\") pod \"crc-debug-r6gwc\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.455399 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18acf442-45f3-47dc-90e0-1ae032f6de41-host\") pod \"crc-debug-r6gwc\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.557862 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59gqs\" (UniqueName: \"kubernetes.io/projected/18acf442-45f3-47dc-90e0-1ae032f6de41-kube-api-access-59gqs\") pod \"crc-debug-r6gwc\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.557989 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18acf442-45f3-47dc-90e0-1ae032f6de41-host\") pod \"crc-debug-r6gwc\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.558165 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18acf442-45f3-47dc-90e0-1ae032f6de41-host\") pod \"crc-debug-r6gwc\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.582944 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59gqs\" (UniqueName: \"kubernetes.io/projected/18acf442-45f3-47dc-90e0-1ae032f6de41-kube-api-access-59gqs\") pod \"crc-debug-r6gwc\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.679847 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc"
Oct 07 15:25:40 crc kubenswrapper[4959]: W1007 15:25:40.750848 4959 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18acf442_45f3_47dc_90e0_1ae032f6de41.slice/crio-6eeaaf88ea762237809b4968b635a342244abbcca78dc6659da849555cdca2ba WatchSource:0}: Error finding container 6eeaaf88ea762237809b4968b635a342244abbcca78dc6659da849555cdca2ba: Status 404 returned error can't find the container with id 6eeaaf88ea762237809b4968b635a342244abbcca78dc6659da849555cdca2ba
Oct 07 15:25:40 crc kubenswrapper[4959]: I1007 15:25:40.943267 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc" event={"ID":"18acf442-45f3-47dc-90e0-1ae032f6de41","Type":"ContainerStarted","Data":"6eeaaf88ea762237809b4968b635a342244abbcca78dc6659da849555cdca2ba"}
Oct 07 15:25:41 crc kubenswrapper[4959]: I1007 15:25:41.957652 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc" event={"ID":"18acf442-45f3-47dc-90e0-1ae032f6de41","Type":"ContainerStarted","Data":"2943187e787f6d0bbcd4f9948fe9e9cbfde571dd76dcf7a9c494b8ad8d131829"}
Oct 07 15:25:41 crc kubenswrapper[4959]: I1007 15:25:41.976377 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc" podStartSLOduration=1.9763472499999999 podStartE2EDuration="1.97634725s" podCreationTimestamp="2025-10-07 15:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:25:41.973810567 +0000 UTC m=+8694.134533264" watchObservedRunningTime="2025-10-07 15:25:41.97634725 +0000 UTC m=+8694.137069927"
Oct 07 15:26:07 crc kubenswrapper[4959]: I1007 15:26:07.696330 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:26:07 crc kubenswrapper[4959]: I1007 15:26:07.697213 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:26:33 crc kubenswrapper[4959]: E1007 15:26:33.809131 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Oct 07 15:26:37 crc kubenswrapper[4959]: I1007 15:26:37.695788 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:26:37 crc kubenswrapper[4959]: I1007 15:26:37.696460 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:26:49 crc kubenswrapper[4959]: I1007 15:26:49.951961 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_9a227eb5-2c22-41c7-a0d8-a35d821c46e6/ansibletest-ansibletest/0.log"
Oct 07 15:26:50 crc kubenswrapper[4959]: I1007 15:26:50.255248 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d8db6f568-8zwbx_80c6297a-2d51-4a7b-9da0-761f69d6f3b7/barbican-api/0.log"
Oct 07 15:26:50 crc kubenswrapper[4959]: I1007 15:26:50.569241 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d8db6f568-8zwbx_80c6297a-2d51-4a7b-9da0-761f69d6f3b7/barbican-api-log/0.log"
Oct 07 15:26:50 crc kubenswrapper[4959]: I1007 15:26:50.790211 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-57fd9f6674-4cfc2_97567312-2948-4f23-a1e5-da00d2689376/barbican-keystone-listener/0.log"
Oct 07 15:26:51 crc kubenswrapper[4959]: I1007 15:26:51.346203 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-878d55485-gnqkk_68235903-6ab3-44c7-90a1-c49f473e4568/barbican-worker/0.log"
Oct 07 15:26:51 crc kubenswrapper[4959]: I1007 15:26:51.437535 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-57fd9f6674-4cfc2_97567312-2948-4f23-a1e5-da00d2689376/barbican-keystone-listener-log/0.log"
Oct 07 15:26:51 crc kubenswrapper[4959]: I1007 15:26:51.609209 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-878d55485-gnqkk_68235903-6ab3-44c7-90a1-c49f473e4568/barbican-worker-log/0.log"
Oct 07 15:26:51 crc kubenswrapper[4959]: I1007 15:26:51.883389 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-w8wvz_7be1a560-abc0-4b57-a960-85019afbe322/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:52 crc kubenswrapper[4959]: I1007 15:26:52.121049 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/ceilometer-central-agent/0.log"
Oct 07 15:26:52 crc kubenswrapper[4959]: I1007 15:26:52.295222 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/ceilometer-notification-agent/0.log"
Oct 07 15:26:52 crc kubenswrapper[4959]: I1007 15:26:52.325086 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/proxy-httpd/0.log"
Oct 07 15:26:52 crc kubenswrapper[4959]: I1007 15:26:52.468933 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3b0fa6d2-0f70-48bf-ba53-542df646b703/sg-core/0.log"
Oct 07 15:26:52 crc kubenswrapper[4959]: I1007 15:26:52.653317 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-vdj42_bc073ac7-2aa0-4ce2-a335-90a7f9bb00b1/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:52 crc kubenswrapper[4959]: I1007 15:26:52.767580 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gkq7b_090ad048-3bec-4657-b329-1fbdba663340/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.049167 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_54a9118f-48be-4663-ba53-6e107a5d09e8/cinder-api-log/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.108595 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_54a9118f-48be-4663-ba53-6e107a5d09e8/cinder-api/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.392192 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3dcdce3c-0b57-4c61-84d9-61c99ba03314/probe/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.497053 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3dcdce3c-0b57-4c61-84d9-61c99ba03314/cinder-backup/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.659525 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96e0aa23-8c42-4616-af38-0eb612e5f181/cinder-scheduler/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.711911 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96e0aa23-8c42-4616-af38-0eb612e5f181/probe/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.931678 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_ef8431b3-9196-4986-aba7-43ffefa14817/cinder-volume/0.log"
Oct 07 15:26:53 crc kubenswrapper[4959]: I1007 15:26:53.991046 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_ef8431b3-9196-4986-aba7-43ffefa14817/probe/0.log"
Oct 07 15:26:54 crc kubenswrapper[4959]: I1007 15:26:54.142070 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wh847_2e1533f6-5266-414d-b116-f87c2acd344a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:54 crc kubenswrapper[4959]: I1007 15:26:54.260666 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vgg98_cffeb5da-ab9c-4c47-a6e2-2e647c4ac860/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:54 crc kubenswrapper[4959]: I1007 15:26:54.585157 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55d8975557-jwvh5_e8fbe198-197d-4725-acfc-c846f5b5c32a/init/0.log"
Oct 07 15:26:54 crc kubenswrapper[4959]: I1007 15:26:54.864012 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55d8975557-jwvh5_e8fbe198-197d-4725-acfc-c846f5b5c32a/init/0.log"
Oct 07 15:26:55 crc kubenswrapper[4959]: I1007 15:26:55.065273 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55d8975557-jwvh5_e8fbe198-197d-4725-acfc-c846f5b5c32a/dnsmasq-dns/0.log"
Oct 07 15:26:55 crc kubenswrapper[4959]: I1007 15:26:55.125080 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77ff234e-dd31-4847-8517-4befe98845f7/glance-log/0.log"
Oct 07 15:26:55 crc kubenswrapper[4959]: I1007 15:26:55.166319 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77ff234e-dd31-4847-8517-4befe98845f7/glance-httpd/0.log"
Oct 07 15:26:55 crc kubenswrapper[4959]: I1007 15:26:55.394664 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cc40f402-6581-45a7-945f-a64d217724ab/glance-httpd/0.log"
Oct 07 15:26:55 crc kubenswrapper[4959]: I1007 15:26:55.409437 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cc40f402-6581-45a7-945f-a64d217724ab/glance-log/0.log"
Oct 07 15:26:55 crc kubenswrapper[4959]: I1007 15:26:55.737992 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc68dfcf6-xkrw7_41b4db91-ead3-4028-b30c-e3e726ae6f1e/horizon/0.log"
Oct 07 15:26:56 crc kubenswrapper[4959]: I1007 15:26:56.028707 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_f07fca81-ce0b-4795-94ce-f4430d953e7a/horizontest-tests-horizontest/0.log"
Oct 07 15:26:56 crc kubenswrapper[4959]: I1007 15:26:56.236131 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pmrjg_1c5e92bc-6eae-4ed1-81e8-400019fc8a13/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:56 crc kubenswrapper[4959]: I1007 15:26:56.500850 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7b997_5e262fa9-5abf-4283-99ed-ead5affb1282/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:57 crc kubenswrapper[4959]: I1007 15:26:57.006538 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc68dfcf6-xkrw7_41b4db91-ead3-4028-b30c-e3e726ae6f1e/horizon-log/0.log"
Oct 07 15:26:57 crc kubenswrapper[4959]: I1007 15:26:57.111879 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330761-j7cjf_0fa0f0ea-e2fb-4b26-a9ea-24ec460e4e40/keystone-cron/0.log"
Oct 07 15:26:57 crc kubenswrapper[4959]: I1007 15:26:57.342563 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330821-7g47d_95a09836-b1d0-4b20-8b66-13cadce981d6/keystone-cron/0.log"
Oct 07 15:26:57 crc kubenswrapper[4959]: I1007 15:26:57.588594 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_44961788-4f6e-4912-a20e-4648a7760dce/kube-state-metrics/0.log"
Oct 07 15:26:57 crc kubenswrapper[4959]: I1007 15:26:57.851220 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qtgrb_dadfbe5e-fc16-4c8f-8c46-9b56fb6801b3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:26:57 crc kubenswrapper[4959]: I1007 15:26:57.883223 4959 scope.go:117] "RemoveContainer" containerID="564ef2224ae46f7fc7ffc9a5f87a96162bd785a353142f80e0aedd81cec1942a"
Oct 07 15:26:58 crc kubenswrapper[4959]: I1007 15:26:58.390138 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_640ef79d-5203-4f5e-8119-2f1eecb02bf1/manila-api-log/0.log"
Oct 07 15:26:58 crc kubenswrapper[4959]: I1007 15:26:58.547687 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_640ef79d-5203-4f5e-8119-2f1eecb02bf1/manila-api/0.log"
Oct 07 15:26:58 crc kubenswrapper[4959]: I1007 15:26:58.907877 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_45ac094a-4d13-4664-94e2-149bdb7b4548/probe/0.log"
Oct 07 15:26:58 crc kubenswrapper[4959]: I1007 15:26:58.999299 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_45ac094a-4d13-4664-94e2-149bdb7b4548/manila-scheduler/0.log"
Oct 07 15:26:59 crc kubenswrapper[4959]: I1007 15:26:59.072680 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-54f9969c74-l8zmx_969d49d0-51dc-47c4-a4fb-aba1b09f4a6a/keystone-api/0.log"
Oct 07 15:26:59 crc kubenswrapper[4959]: I1007 15:26:59.299800 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8e08ebef-1b6b-4040-8b0f-7c841e191363/probe/0.log"
Oct 07 15:26:59 crc kubenswrapper[4959]: I1007 15:26:59.330329 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8e08ebef-1b6b-4040-8b0f-7c841e191363/manila-share/0.log"
Oct 07 15:27:01 crc kubenswrapper[4959]: I1007 15:27:01.089650 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-86cfbf9b4f-pxglw_46472ab2-866f-4b3c-b030-7b05d02f9176/neutron-httpd/0.log"
Oct 07 15:27:01 crc kubenswrapper[4959]: I1007 15:27:01.577435 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ls5jx_fdb0f30e-5c49-4039-b739-ab6eb7eb7a8c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:27:01 crc kubenswrapper[4959]: I1007 15:27:01.690650 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-86cfbf9b4f-pxglw_46472ab2-866f-4b3c-b030-7b05d02f9176/neutron-api/0.log"
Oct 07 15:27:06 crc kubenswrapper[4959]: I1007 15:27:06.431898 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e/nova-api-log/0.log"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.669658 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bcabf204-0890-4bfc-9a94-b921b3011603/nova-cell0-conductor-conductor/0.log"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.695191 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.695264 4959 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.695331 4959 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.696508 4959 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"} pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.696592 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" containerID="cri-o://8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" gracePeriod=600
Oct 07 15:27:07 crc kubenswrapper[4959]: E1007 15:27:07.843089 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.883792 4959 generic.go:334] "Generic (PLEG): container finished" podID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" exitCode=0
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.883861 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerDied","Data":"8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"}
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.883916 4959 scope.go:117] "RemoveContainer" containerID="1f81262f8f671a04434be7cd785ed736a33dcfb2d12c747b6f73ee2bb5bd4c14"
Oct 07 15:27:07 crc kubenswrapper[4959]: I1007 15:27:07.885315 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"
Oct 07 15:27:07 crc kubenswrapper[4959]: E1007 15:27:07.887079 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:27:08 crc kubenswrapper[4959]: I1007 15:27:08.116322 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bcc4aa01-2f10-4f3d-b7d0-fb60132a1c2e/nova-api-api/0.log"
Oct 07 15:27:08 crc kubenswrapper[4959]: I1007 15:27:08.158964 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8446fa80-aebe-45ba-a6a7-4f51402f3d38/nova-cell1-conductor-conductor/0.log"
Oct 07 15:27:08 crc kubenswrapper[4959]: I1007 15:27:08.460322 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9addbd40-1800-4967-bb06-7a90697034dd/nova-cell1-novncproxy-novncproxy/0.log"
Oct 07 15:27:08 crc kubenswrapper[4959]: I1007 15:27:08.516672 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-rfzqw_0ebc66fe-ebad-47d5-93df-fbff665959d9/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:27:08 crc kubenswrapper[4959]: I1007 15:27:08.787108 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_627022b9-2219-4bcd-a001-53bf9e863c14/nova-metadata-log/0.log"
Oct 07 15:27:09 crc kubenswrapper[4959]: I1007 15:27:09.618924 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fed91ea6-e906-47c4-84e0-123c01a9780d/mysql-bootstrap/0.log"
Oct 07 15:27:09 crc kubenswrapper[4959]: I1007 15:27:09.876188 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fed91ea6-e906-47c4-84e0-123c01a9780d/mysql-bootstrap/0.log"
Oct 07 15:27:09 crc kubenswrapper[4959]: I1007 15:27:09.888937 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a46a9782-e96e-432c-b2e8-c7863291485e/nova-scheduler-scheduler/0.log"
Oct 07 15:27:10 crc kubenswrapper[4959]: I1007 15:27:10.138288 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fed91ea6-e906-47c4-84e0-123c01a9780d/galera/0.log"
Oct 07 15:27:10 crc kubenswrapper[4959]: I1007 15:27:10.401003 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e980567-4b6d-474f-ae89-3dc436ebf1a5/mysql-bootstrap/0.log"
Oct 07 15:27:10 crc kubenswrapper[4959]: I1007 15:27:10.654108 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e980567-4b6d-474f-ae89-3dc436ebf1a5/mysql-bootstrap/0.log"
Oct 07 15:27:10 crc kubenswrapper[4959]: I1007 15:27:10.805517 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e980567-4b6d-474f-ae89-3dc436ebf1a5/galera/0.log"
Oct 07 15:27:11 crc kubenswrapper[4959]: I1007 15:27:11.022852 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_eab0abf5-c944-4a5c-9259-6dc0ea2b115f/openstackclient/0.log"
Oct 07 15:27:11 crc kubenswrapper[4959]: I1007 15:27:11.245221 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lqwt9_dde9002d-236f-4dc3-947e-98e1e4e535c1/openstack-network-exporter/0.log"
Oct 07 15:27:11 crc kubenswrapper[4959]: I1007 15:27:11.579723 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovsdb-server-init/0.log"
Oct 07 15:27:11 crc kubenswrapper[4959]: I1007 15:27:11.858344 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovsdb-server-init/0.log"
Oct 07 15:27:11 crc kubenswrapper[4959]: I1007 15:27:11.875966 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovs-vswitchd/0.log"
Oct 07 15:27:12 crc kubenswrapper[4959]: I1007 15:27:12.072767 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4nl8g_ba866f69-2f83-4b66-b1af-693f07c437e0/ovsdb-server/0.log"
Oct 07 15:27:12 crc kubenswrapper[4959]: I1007 15:27:12.358153 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-z8f9v_907772e5-2f0c-4478-9d3b-8f82eec8f258/ovn-controller/0.log"
Oct 07 15:27:12 crc kubenswrapper[4959]: I1007 15:27:12.594062 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-w2tmp_153879ad-6c45-43f1-a7e7-6b7e2f4e8cf7/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 15:27:12 crc kubenswrapper[4959]: I1007 15:27:12.847582 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b18eda78-12ab-4cb2-ac1c-56907a2b4667/openstack-network-exporter/0.log"
Oct 07 15:27:12 crc kubenswrapper[4959]: I1007 15:27:12.917667 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b18eda78-12ab-4cb2-ac1c-56907a2b4667/ovn-northd/0.log"
Oct 07 15:27:13 crc kubenswrapper[4959]: I1007 15:27:13.189889 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_26d915bf-8d27-4349-9a3b-f13f13809cf5/openstack-network-exporter/0.log"
Oct 07 15:27:13 crc kubenswrapper[4959]: I1007 15:27:13.346369 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_26d915bf-8d27-4349-9a3b-f13f13809cf5/ovsdbserver-nb/0.log"
Oct 07 15:27:13 crc kubenswrapper[4959]: I1007 15:27:13.607331 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_151d32f4-496d-43a0-aeb7-ee999d5faeef/openstack-network-exporter/0.log"
Oct 07 15:27:13 crc kubenswrapper[4959]: I1007 15:27:13.666979 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_627022b9-2219-4bcd-a001-53bf9e863c14/nova-metadata-metadata/0.log"
Oct 07 15:27:13 crc kubenswrapper[4959]: I1007 15:27:13.729071 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_151d32f4-496d-43a0-aeb7-ee999d5faeef/ovsdbserver-sb/0.log"
Oct 07 15:27:14 crc kubenswrapper[4959]: I1007 15:27:14.519484 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1/setup-container/0.log"
Oct 07 15:27:14 crc kubenswrapper[4959]: I1007 15:27:14.531274 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58674f758b-wncml_ef3ca2a1-1eed-47fc-8454-47decce134d5/placement-api/0.log"
Oct 07 15:27:14 crc kubenswrapper[4959]: I1007 15:27:14.744060 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1/setup-container/0.log"
Oct 07 15:27:14 crc kubenswrapper[4959]: I1007 15:27:14.792096 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58674f758b-wncml_ef3ca2a1-1eed-47fc-8454-47decce134d5/placement-log/0.log"
Oct 07 15:27:14 crc kubenswrapper[4959]: I1007 15:27:14.860754 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bd08ab8c-e65e-4ca6-8cd3-a62bec086bb1/rabbitmq/0.log"
Oct 07 15:27:15 crc kubenswrapper[4959]: I1007 15:27:15.055879 4959 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_rabbitmq-server-0_52260e60-f3cc-46d0-b7ce-0424500d0573/setup-container/0.log" Oct 07 15:27:15 crc kubenswrapper[4959]: I1007 15:27:15.350474 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_52260e60-f3cc-46d0-b7ce-0424500d0573/setup-container/0.log" Oct 07 15:27:15 crc kubenswrapper[4959]: I1007 15:27:15.431216 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_52260e60-f3cc-46d0-b7ce-0424500d0573/rabbitmq/0.log" Oct 07 15:27:15 crc kubenswrapper[4959]: I1007 15:27:15.662721 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2bzdv_1d2aa3cc-f250-4d9e-b6da-921018115809/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:27:15 crc kubenswrapper[4959]: I1007 15:27:15.728372 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gcrbs_1522ab05-1ecc-4aad-8196-557397dd2ebf/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:27:16 crc kubenswrapper[4959]: I1007 15:27:16.021039 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-t4pk5_feecb62b-99f0-41a7-80ce-3e8538801512/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:27:16 crc kubenswrapper[4959]: I1007 15:27:16.583731 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5cbpb_e4d99350-2d4f-451a-a539-e7a72f41ad3a/ssh-known-hosts-edpm-deployment/0.log" Oct 07 15:27:16 crc kubenswrapper[4959]: I1007 15:27:16.728093 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_b14b7636-6093-478a-945a-a512ef1935b4/tempest-tests-tempest-tests-runner/0.log" Oct 07 15:27:16 crc kubenswrapper[4959]: I1007 15:27:16.807334 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_9579dcb2-a7ed-4955-83e6-2b8a2d2ffec8/tempest-tests-tempest-tests-runner/0.log" Oct 07 15:27:17 crc kubenswrapper[4959]: I1007 15:27:17.038012 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_4cf18b5e-3c67-4d0f-a2d1-9eb6bf035825/test-operator-logs-container/0.log" Oct 07 15:27:17 crc kubenswrapper[4959]: I1007 15:27:17.235356 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_8ddae6ed-5ca7-45f7-bf73-afea2af7d7de/test-operator-logs-container/0.log" Oct 07 15:27:17 crc kubenswrapper[4959]: I1007 15:27:17.354410 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2611594a-b816-4cdc-b55b-d6ac6e281071/test-operator-logs-container/0.log" Oct 07 15:27:17 crc kubenswrapper[4959]: I1007 15:27:17.511159 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_a85e3cec-d699-4a9f-9da3-809799b06f1c/test-operator-logs-container/0.log" Oct 07 15:27:17 crc kubenswrapper[4959]: I1007 15:27:17.773388 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_36d2c381-4cb1-4b35-b315-d1d4847f70c7/tobiko-tests-tobiko/0.log" Oct 07 15:27:17 crc kubenswrapper[4959]: I1007 15:27:17.901779 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_6d961e86-0037-4c2a-ac1f-b73c10339406/tobiko-tests-tobiko/0.log" Oct 07 15:27:18 crc kubenswrapper[4959]: I1007 15:27:18.092740 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mdwrs_0d3a592c-85aa-455c-a39e-cf2ec5c1f292/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:27:18 
crc kubenswrapper[4959]: I1007 15:27:18.818038 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:27:18 crc kubenswrapper[4959]: E1007 15:27:18.818403 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:27:25 crc kubenswrapper[4959]: I1007 15:27:25.057738 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_72f6396e-c1ff-485b-8878-33f9ab5dc874/memcached/0.log" Oct 07 15:27:31 crc kubenswrapper[4959]: I1007 15:27:31.809761 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:27:31 crc kubenswrapper[4959]: E1007 15:27:31.810860 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:27:42 crc kubenswrapper[4959]: I1007 15:27:42.809577 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:27:42 crc kubenswrapper[4959]: E1007 15:27:42.810791 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:27:53 crc kubenswrapper[4959]: I1007 15:27:53.809237 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:27:53 crc kubenswrapper[4959]: E1007 15:27:53.811148 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:28:03 crc kubenswrapper[4959]: E1007 15:28:03.810137 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:28:08 crc kubenswrapper[4959]: I1007 15:28:08.818453 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:28:08 crc kubenswrapper[4959]: E1007 15:28:08.820046 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:28:10 crc kubenswrapper[4959]: I1007 15:28:10.610726 4959 generic.go:334] "Generic (PLEG): container finished" 
podID="18acf442-45f3-47dc-90e0-1ae032f6de41" containerID="2943187e787f6d0bbcd4f9948fe9e9cbfde571dd76dcf7a9c494b8ad8d131829" exitCode=0 Oct 07 15:28:10 crc kubenswrapper[4959]: I1007 15:28:10.610796 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc" event={"ID":"18acf442-45f3-47dc-90e0-1ae032f6de41","Type":"ContainerDied","Data":"2943187e787f6d0bbcd4f9948fe9e9cbfde571dd76dcf7a9c494b8ad8d131829"} Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.344861 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.401864 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-r6gwc"] Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.403642 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59gqs\" (UniqueName: \"kubernetes.io/projected/18acf442-45f3-47dc-90e0-1ae032f6de41-kube-api-access-59gqs\") pod \"18acf442-45f3-47dc-90e0-1ae032f6de41\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.403995 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18acf442-45f3-47dc-90e0-1ae032f6de41-host\") pod \"18acf442-45f3-47dc-90e0-1ae032f6de41\" (UID: \"18acf442-45f3-47dc-90e0-1ae032f6de41\") " Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.404762 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18acf442-45f3-47dc-90e0-1ae032f6de41-host" (OuterVolumeSpecName: "host") pod "18acf442-45f3-47dc-90e0-1ae032f6de41" (UID: "18acf442-45f3-47dc-90e0-1ae032f6de41"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.412333 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-r6gwc"] Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.414950 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18acf442-45f3-47dc-90e0-1ae032f6de41-kube-api-access-59gqs" (OuterVolumeSpecName: "kube-api-access-59gqs") pod "18acf442-45f3-47dc-90e0-1ae032f6de41" (UID: "18acf442-45f3-47dc-90e0-1ae032f6de41"). InnerVolumeSpecName "kube-api-access-59gqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.507169 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59gqs\" (UniqueName: \"kubernetes.io/projected/18acf442-45f3-47dc-90e0-1ae032f6de41-kube-api-access-59gqs\") on node \"crc\" DevicePath \"\"" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.507222 4959 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18acf442-45f3-47dc-90e0-1ae032f6de41-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.635878 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eeaaf88ea762237809b4968b635a342244abbcca78dc6659da849555cdca2ba" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.636052 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-r6gwc" Oct 07 15:28:12 crc kubenswrapper[4959]: I1007 15:28:12.822804 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18acf442-45f3-47dc-90e0-1ae032f6de41" path="/var/lib/kubelet/pods/18acf442-45f3-47dc-90e0-1ae032f6de41/volumes" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.633731 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-p52ch"] Oct 07 15:28:13 crc kubenswrapper[4959]: E1007 15:28:13.634928 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18acf442-45f3-47dc-90e0-1ae032f6de41" containerName="container-00" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.634949 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="18acf442-45f3-47dc-90e0-1ae032f6de41" containerName="container-00" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.635336 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="18acf442-45f3-47dc-90e0-1ae032f6de41" containerName="container-00" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.636579 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.638935 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hc7gg"/"default-dockercfg-5lm25" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.736987 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2gsq\" (UniqueName: \"kubernetes.io/projected/5348d643-c8d4-44f3-af6e-10088660bdf8-kube-api-access-t2gsq\") pod \"crc-debug-p52ch\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.737470 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5348d643-c8d4-44f3-af6e-10088660bdf8-host\") pod \"crc-debug-p52ch\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.840321 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2gsq\" (UniqueName: \"kubernetes.io/projected/5348d643-c8d4-44f3-af6e-10088660bdf8-kube-api-access-t2gsq\") pod \"crc-debug-p52ch\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.840751 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5348d643-c8d4-44f3-af6e-10088660bdf8-host\") pod \"crc-debug-p52ch\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.840963 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5348d643-c8d4-44f3-af6e-10088660bdf8-host\") pod \"crc-debug-p52ch\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.860776 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2gsq\" (UniqueName: \"kubernetes.io/projected/5348d643-c8d4-44f3-af6e-10088660bdf8-kube-api-access-t2gsq\") pod \"crc-debug-p52ch\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:13 crc kubenswrapper[4959]: I1007 15:28:13.959758 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:14 crc kubenswrapper[4959]: I1007 15:28:14.656799 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" event={"ID":"5348d643-c8d4-44f3-af6e-10088660bdf8","Type":"ContainerStarted","Data":"7666ef6e0509674a419c2d3374f02cd59705edd5b00b6b5df19c9cc1b2c732c2"} Oct 07 15:28:14 crc kubenswrapper[4959]: I1007 15:28:14.657251 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" event={"ID":"5348d643-c8d4-44f3-af6e-10088660bdf8","Type":"ContainerStarted","Data":"c33f9f9a0a32be0d60d5af9917ff2bae76ded706cd93bf7960f98ea6471916e0"} Oct 07 15:28:15 crc kubenswrapper[4959]: I1007 15:28:15.670783 4959 generic.go:334] "Generic (PLEG): container finished" podID="5348d643-c8d4-44f3-af6e-10088660bdf8" containerID="7666ef6e0509674a419c2d3374f02cd59705edd5b00b6b5df19c9cc1b2c732c2" exitCode=0 Oct 07 15:28:15 crc kubenswrapper[4959]: I1007 15:28:15.670910 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" event={"ID":"5348d643-c8d4-44f3-af6e-10088660bdf8","Type":"ContainerDied","Data":"7666ef6e0509674a419c2d3374f02cd59705edd5b00b6b5df19c9cc1b2c732c2"} Oct 07 15:28:16 
crc kubenswrapper[4959]: I1007 15:28:16.800958 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:16 crc kubenswrapper[4959]: I1007 15:28:16.909923 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5348d643-c8d4-44f3-af6e-10088660bdf8-host\") pod \"5348d643-c8d4-44f3-af6e-10088660bdf8\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " Oct 07 15:28:16 crc kubenswrapper[4959]: I1007 15:28:16.911258 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2gsq\" (UniqueName: \"kubernetes.io/projected/5348d643-c8d4-44f3-af6e-10088660bdf8-kube-api-access-t2gsq\") pod \"5348d643-c8d4-44f3-af6e-10088660bdf8\" (UID: \"5348d643-c8d4-44f3-af6e-10088660bdf8\") " Oct 07 15:28:16 crc kubenswrapper[4959]: I1007 15:28:16.910070 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5348d643-c8d4-44f3-af6e-10088660bdf8-host" (OuterVolumeSpecName: "host") pod "5348d643-c8d4-44f3-af6e-10088660bdf8" (UID: "5348d643-c8d4-44f3-af6e-10088660bdf8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:28:16 crc kubenswrapper[4959]: I1007 15:28:16.914574 4959 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5348d643-c8d4-44f3-af6e-10088660bdf8-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:28:16 crc kubenswrapper[4959]: I1007 15:28:16.919467 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5348d643-c8d4-44f3-af6e-10088660bdf8-kube-api-access-t2gsq" (OuterVolumeSpecName: "kube-api-access-t2gsq") pod "5348d643-c8d4-44f3-af6e-10088660bdf8" (UID: "5348d643-c8d4-44f3-af6e-10088660bdf8"). InnerVolumeSpecName "kube-api-access-t2gsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:28:17 crc kubenswrapper[4959]: I1007 15:28:17.016050 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2gsq\" (UniqueName: \"kubernetes.io/projected/5348d643-c8d4-44f3-af6e-10088660bdf8-kube-api-access-t2gsq\") on node \"crc\" DevicePath \"\"" Oct 07 15:28:17 crc kubenswrapper[4959]: I1007 15:28:17.697333 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" event={"ID":"5348d643-c8d4-44f3-af6e-10088660bdf8","Type":"ContainerDied","Data":"c33f9f9a0a32be0d60d5af9917ff2bae76ded706cd93bf7960f98ea6471916e0"} Oct 07 15:28:17 crc kubenswrapper[4959]: I1007 15:28:17.697401 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c33f9f9a0a32be0d60d5af9917ff2bae76ded706cd93bf7960f98ea6471916e0" Oct 07 15:28:17 crc kubenswrapper[4959]: I1007 15:28:17.697510 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-p52ch" Oct 07 15:28:19 crc kubenswrapper[4959]: I1007 15:28:19.812639 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:28:19 crc kubenswrapper[4959]: E1007 15:28:19.813858 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.201848 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vcgpl"] Oct 07 15:28:20 crc kubenswrapper[4959]: E1007 15:28:20.202362 4959 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5348d643-c8d4-44f3-af6e-10088660bdf8" containerName="container-00" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.202384 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="5348d643-c8d4-44f3-af6e-10088660bdf8" containerName="container-00" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.202583 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="5348d643-c8d4-44f3-af6e-10088660bdf8" containerName="container-00" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.204170 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.212373 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcgpl"] Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.284968 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-catalog-content\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.285073 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8l7n\" (UniqueName: \"kubernetes.io/projected/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-kube-api-access-l8l7n\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.285224 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-utilities\") pod 
\"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.387160 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-catalog-content\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.387258 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8l7n\" (UniqueName: \"kubernetes.io/projected/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-kube-api-access-l8l7n\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.387357 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-utilities\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.389280 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-utilities\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.389695 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-catalog-content\") pod \"community-operators-vcgpl\" (UID: 
\"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.420398 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8l7n\" (UniqueName: \"kubernetes.io/projected/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-kube-api-access-l8l7n\") pod \"community-operators-vcgpl\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") " pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:20 crc kubenswrapper[4959]: I1007 15:28:20.542552 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcgpl" Oct 07 15:28:21 crc kubenswrapper[4959]: I1007 15:28:21.226258 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcgpl"] Oct 07 15:28:21 crc kubenswrapper[4959]: I1007 15:28:21.849182 4959 generic.go:334] "Generic (PLEG): container finished" podID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerID="27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683" exitCode=0 Oct 07 15:28:21 crc kubenswrapper[4959]: I1007 15:28:21.849236 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerDied","Data":"27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683"} Oct 07 15:28:21 crc kubenswrapper[4959]: I1007 15:28:21.849492 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerStarted","Data":"36ee4632af7d22eacd07d343dedf47f35845688ea3ac10fcb4ff9f8adafdddc0"} Oct 07 15:28:21 crc kubenswrapper[4959]: I1007 15:28:21.854723 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:28:23 crc kubenswrapper[4959]: I1007 15:28:23.892452 4959 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerStarted","Data":"0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6"}
Oct 07 15:28:26 crc kubenswrapper[4959]: I1007 15:28:26.614882 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-p52ch"]
Oct 07 15:28:26 crc kubenswrapper[4959]: I1007 15:28:26.622848 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-p52ch"]
Oct 07 15:28:26 crc kubenswrapper[4959]: I1007 15:28:26.829388 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5348d643-c8d4-44f3-af6e-10088660bdf8" path="/var/lib/kubelet/pods/5348d643-c8d4-44f3-af6e-10088660bdf8/volumes"
Oct 07 15:28:26 crc kubenswrapper[4959]: I1007 15:28:26.920547 4959 generic.go:334] "Generic (PLEG): container finished" podID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerID="0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6" exitCode=0
Oct 07 15:28:26 crc kubenswrapper[4959]: I1007 15:28:26.920605 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerDied","Data":"0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6"}
Oct 07 15:28:27 crc kubenswrapper[4959]: I1007 15:28:27.839429 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-75t54"]
Oct 07 15:28:27 crc kubenswrapper[4959]: I1007 15:28:27.841915 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:27 crc kubenswrapper[4959]: I1007 15:28:27.844774 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hc7gg"/"default-dockercfg-5lm25"
Oct 07 15:28:27 crc kubenswrapper[4959]: I1007 15:28:27.971163 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-host\") pod \"crc-debug-75t54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") " pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:27 crc kubenswrapper[4959]: I1007 15:28:27.971567 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8ms\" (UniqueName: \"kubernetes.io/projected/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-kube-api-access-6j8ms\") pod \"crc-debug-75t54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") " pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:28 crc kubenswrapper[4959]: I1007 15:28:28.074558 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-host\") pod \"crc-debug-75t54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") " pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:28 crc kubenswrapper[4959]: I1007 15:28:28.074743 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-host\") pod \"crc-debug-75t54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") " pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:28 crc kubenswrapper[4959]: I1007 15:28:28.074756 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8ms\" (UniqueName: \"kubernetes.io/projected/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-kube-api-access-6j8ms\") pod \"crc-debug-75t54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") " pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:28 crc kubenswrapper[4959]: I1007 15:28:28.411715 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8ms\" (UniqueName: \"kubernetes.io/projected/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-kube-api-access-6j8ms\") pod \"crc-debug-75t54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") " pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:28 crc kubenswrapper[4959]: I1007 15:28:28.475475 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:28 crc kubenswrapper[4959]: I1007 15:28:28.942063 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-75t54" event={"ID":"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54","Type":"ContainerStarted","Data":"6c31c269ce4671f8fb57acc399d0d59d2d7b54bfe46977df09ecd6f2778a836e"}
Oct 07 15:28:29 crc kubenswrapper[4959]: I1007 15:28:29.955425 4959 generic.go:334] "Generic (PLEG): container finished" podID="788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" containerID="cb7042a03c9901f0ec039baa499737f84083376194d0307c809042e7fce8c273" exitCode=0
Oct 07 15:28:29 crc kubenswrapper[4959]: I1007 15:28:29.955622 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/crc-debug-75t54" event={"ID":"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54","Type":"ContainerDied","Data":"cb7042a03c9901f0ec039baa499737f84083376194d0307c809042e7fce8c273"}
Oct 07 15:28:29 crc kubenswrapper[4959]: I1007 15:28:29.960831 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerStarted","Data":"a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9"}
Oct 07 15:28:30 crc kubenswrapper[4959]: I1007 15:28:30.011271 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vcgpl" podStartSLOduration=3.225134432 podStartE2EDuration="10.01124185s" podCreationTimestamp="2025-10-07 15:28:20 +0000 UTC" firstStartedPulling="2025-10-07 15:28:21.854407628 +0000 UTC m=+8854.015130305" lastFinishedPulling="2025-10-07 15:28:28.640515046 +0000 UTC m=+8860.801237723" observedRunningTime="2025-10-07 15:28:29.998434523 +0000 UTC m=+8862.159157210" watchObservedRunningTime="2025-10-07 15:28:30.01124185 +0000 UTC m=+8862.171964527"
Oct 07 15:28:30 crc kubenswrapper[4959]: I1007 15:28:30.046579 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-75t54"]
Oct 07 15:28:30 crc kubenswrapper[4959]: I1007 15:28:30.058106 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hc7gg/crc-debug-75t54"]
Oct 07 15:28:30 crc kubenswrapper[4959]: I1007 15:28:30.544975 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vcgpl"
Oct 07 15:28:30 crc kubenswrapper[4959]: I1007 15:28:30.545081 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vcgpl"
Oct 07 15:28:30 crc kubenswrapper[4959]: I1007 15:28:30.810001 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"
Oct 07 15:28:30 crc kubenswrapper[4959]: E1007 15:28:30.810349 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.087749 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.204169 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-host\") pod \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") "
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.204346 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-host" (OuterVolumeSpecName: "host") pod "788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" (UID: "788d2d4c-be2a-4f3a-bd40-9ac041bc5f54"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.204387 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j8ms\" (UniqueName: \"kubernetes.io/projected/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-kube-api-access-6j8ms\") pod \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\" (UID: \"788d2d4c-be2a-4f3a-bd40-9ac041bc5f54\") "
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.205814 4959 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-host\") on node \"crc\" DevicePath \"\""
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.212812 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-kube-api-access-6j8ms" (OuterVolumeSpecName: "kube-api-access-6j8ms") pod "788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" (UID: "788d2d4c-be2a-4f3a-bd40-9ac041bc5f54"). InnerVolumeSpecName "kube-api-access-6j8ms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.308865 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j8ms\" (UniqueName: \"kubernetes.io/projected/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54-kube-api-access-6j8ms\") on node \"crc\" DevicePath \"\""
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.606981 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vcgpl" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="registry-server" probeResult="failure" output=<
Oct 07 15:28:31 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s
Oct 07 15:28:31 crc kubenswrapper[4959]: >
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.984821 4959 scope.go:117] "RemoveContainer" containerID="cb7042a03c9901f0ec039baa499737f84083376194d0307c809042e7fce8c273"
Oct 07 15:28:31 crc kubenswrapper[4959]: I1007 15:28:31.985039 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/crc-debug-75t54"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.201645 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/util/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.453808 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/pull/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.482804 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/pull/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.496550 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/util/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.709228 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/pull/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.730597 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/extract/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.736918 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_563473821d46f325f62a2983c417e07a5da08327e4dc09e802caf673b08n7gr_1d76c837-d256-4ea9-a23f-e55ee516e726/util/0.log"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.833746 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" path="/var/lib/kubelet/pods/788d2d4c-be2a-4f3a-bd40-9ac041bc5f54/volumes"
Oct 07 15:28:32 crc kubenswrapper[4959]: I1007 15:28:32.949727 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f56ff694-b4rhk_c2a805f1-946a-4b48-9e52-4f24b56bd43a/kube-rbac-proxy/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.037031 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f56ff694-b4rhk_c2a805f1-946a-4b48-9e52-4f24b56bd43a/manager/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.049463 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-zl4v9_3e59fb97-6ef4-42a5-a264-506bdccd8a23/kube-rbac-proxy/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.239516 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-zl4v9_3e59fb97-6ef4-42a5-a264-506bdccd8a23/manager/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.248592 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-xmpkg_01b13867-f984-4d88-af12-28fc3ebc0b9f/kube-rbac-proxy/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.285416 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-xmpkg_01b13867-f984-4d88-af12-28fc3ebc0b9f/manager/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.454199 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-fd648f65-rmk5h_035c3aeb-396b-47bf-a588-562bb0f27f88/kube-rbac-proxy/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.533547 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-fd648f65-rmk5h_035c3aeb-396b-47bf-a588-562bb0f27f88/manager/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.704724 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7ccfc8cf49-d4g6g_4294ed44-d412-4366-959e-cb534ab792bc/kube-rbac-proxy/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.808896 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7ccfc8cf49-d4g6g_4294ed44-d412-4366-959e-cb534ab792bc/manager/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.865105 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b477879bc-nf6mt_539702ff-226a-4c31-b715-af9af8ae1205/kube-rbac-proxy/0.log"
Oct 07 15:28:33 crc kubenswrapper[4959]: I1007 15:28:33.977392 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b477879bc-nf6mt_539702ff-226a-4c31-b715-af9af8ae1205/manager/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.146114 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-d772s_e3213b12-9128-4d7c-8ec8-a731e6627de4/kube-rbac-proxy/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.246619 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-d772s_e3213b12-9128-4d7c-8ec8-a731e6627de4/manager/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.315400 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5467f8988c-6t98f_749f8ff6-9e1c-45ef-948f-1f8c255b670e/kube-rbac-proxy/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.425049 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5467f8988c-6t98f_749f8ff6-9e1c-45ef-948f-1f8c255b670e/manager/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.467987 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5b84cc7657-r57lc_77bcfec2-4667-4415-af5e-3009e5ea4999/kube-rbac-proxy/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.638789 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5b84cc7657-r57lc_77bcfec2-4667-4415-af5e-3009e5ea4999/manager/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.730215 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-26qz6_6e224af6-7095-4878-ba65-3a8e3f358968/kube-rbac-proxy/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.770636 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-26qz6_6e224af6-7095-4878-ba65-3a8e3f358968/manager/0.log"
Oct 07 15:28:34 crc kubenswrapper[4959]: I1007 15:28:34.953114 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-6kq7p_5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85/kube-rbac-proxy/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.002813 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-6kq7p_5f03fbe8-3b8d-43eb-8c10-0c63f2a67d85/manager/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.163283 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-6tcd6_eea6d6d3-ded0-4788-8901-34c02d659aee/kube-rbac-proxy/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.216472 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-6tcd6_eea6d6d3-ded0-4788-8901-34c02d659aee/manager/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.273843 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-t2wjn_64d46cbf-e1a4-4673-9f0e-01371175a1f9/kube-rbac-proxy/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.487820 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-t2wjn_64d46cbf-e1a4-4673-9f0e-01371175a1f9/manager/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.567280 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-sfhzx_171d0807-668d-4284-ab63-698401676fbe/kube-rbac-proxy/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.567296 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-sfhzx_171d0807-668d-4284-ab63-698401676fbe/manager/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.742081 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw_0e606b13-be7c-4699-bb4b-5c50ddf32426/kube-rbac-proxy/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.759003 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7dcdb4fdb8ndwzw_0e606b13-be7c-4699-bb4b-5c50ddf32426/manager/0.log"
Oct 07 15:28:35 crc kubenswrapper[4959]: I1007 15:28:35.789226 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-fd79fd9-ktzrv_370cd57f-855c-4584-a0c1-c806f93bd8d7/kube-rbac-proxy/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.056924 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86c7c896d7-mzwlr_88b5404c-1e9b-42c9-9c21-fb32b136db86/operator/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.062752 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86c7c896d7-mzwlr_88b5404c-1e9b-42c9-9c21-fb32b136db86/kube-rbac-proxy/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.377377 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54d485fd9-2vwpz_f8ddf44b-e556-40c6-a3f8-699d756434dd/kube-rbac-proxy/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.523267 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vptcw_8a32157a-8fdd-4430-9d22-3401166e4352/registry-server/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.618838 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54d485fd9-2vwpz_f8ddf44b-e556-40c6-a3f8-699d756434dd/manager/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.669046 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-nkfnv_d8ff35a5-f26c-4077-bdad-baa63159c6e4/kube-rbac-proxy/0.log"
Oct 07 15:28:36 crc kubenswrapper[4959]: I1007 15:28:36.858148 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-nkfnv_d8ff35a5-f26c-4077-bdad-baa63159c6e4/manager/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.036293 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-pjlcp_1f48e97d-5d4f-49d3-b550-d51242109806/operator/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.165250 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-k6btb_86969b11-9037-4890-93dc-575b83669d0f/kube-rbac-proxy/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.271137 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-k6btb_86969b11-9037-4890-93dc-575b83669d0f/manager/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.309323 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-sp68w_cac1fe47-f06a-44fb-b4fe-a19faa802cca/kube-rbac-proxy/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.355009 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-fd79fd9-ktzrv_370cd57f-855c-4584-a0c1-c806f93bd8d7/manager/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.504551 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-sp68w_cac1fe47-f06a-44fb-b4fe-a19faa802cca/manager/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.579979 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55c6894594-6pn9s_6108c0b3-e7a9-412c-9085-0eea09f342c6/kube-rbac-proxy/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.584270 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55c6894594-6pn9s_6108c0b3-e7a9-412c-9085-0eea09f342c6/manager/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.730074 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-rgw4d_1062e16d-6129-48d2-a385-d988ac5fe4f7/kube-rbac-proxy/0.log"
Oct 07 15:28:37 crc kubenswrapper[4959]: I1007 15:28:37.799567 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-rgw4d_1062e16d-6129-48d2-a385-d988ac5fe4f7/manager/0.log"
Oct 07 15:28:40 crc kubenswrapper[4959]: I1007 15:28:40.622438 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vcgpl"
Oct 07 15:28:40 crc kubenswrapper[4959]: I1007 15:28:40.700724 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vcgpl"
Oct 07 15:28:40 crc kubenswrapper[4959]: I1007 15:28:40.870675 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcgpl"]
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.098012 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vcgpl" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="registry-server" containerID="cri-o://a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9" gracePeriod=2
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.672954 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcgpl"
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.795443 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-utilities\") pod \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") "
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.796123 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8l7n\" (UniqueName: \"kubernetes.io/projected/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-kube-api-access-l8l7n\") pod \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") "
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.796193 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-catalog-content\") pod \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\" (UID: \"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7\") "
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.798047 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-utilities" (OuterVolumeSpecName: "utilities") pod "56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" (UID: "56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.810719 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-kube-api-access-l8l7n" (OuterVolumeSpecName: "kube-api-access-l8l7n") pod "56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" (UID: "56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7"). InnerVolumeSpecName "kube-api-access-l8l7n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.852219 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" (UID: "56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.899283 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8l7n\" (UniqueName: \"kubernetes.io/projected/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-kube-api-access-l8l7n\") on node \"crc\" DevicePath \"\""
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.899544 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 15:28:42 crc kubenswrapper[4959]: I1007 15:28:42.899615 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.109056 4959 generic.go:334] "Generic (PLEG): container finished" podID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerID="a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9" exitCode=0
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.109104 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerDied","Data":"a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9"}
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.109133 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgpl" event={"ID":"56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7","Type":"ContainerDied","Data":"36ee4632af7d22eacd07d343dedf47f35845688ea3ac10fcb4ff9f8adafdddc0"}
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.109154 4959 scope.go:117] "RemoveContainer" containerID="a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.109276 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcgpl"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.148329 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcgpl"]
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.149415 4959 scope.go:117] "RemoveContainer" containerID="0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.161464 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vcgpl"]
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.170004 4959 scope.go:117] "RemoveContainer" containerID="27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.221679 4959 scope.go:117] "RemoveContainer" containerID="a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9"
Oct 07 15:28:43 crc kubenswrapper[4959]: E1007 15:28:43.222385 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9\": container with ID starting with a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9 not found: ID does not exist" containerID="a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.222474 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9"} err="failed to get container status \"a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9\": rpc error: code = NotFound desc = could not find container \"a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9\": container with ID starting with a5fe649cb5f08c964837837e593be9314c615cb9d4bb123e54f6ad5d9128f1d9 not found: ID does not exist"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.222515 4959 scope.go:117] "RemoveContainer" containerID="0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6"
Oct 07 15:28:43 crc kubenswrapper[4959]: E1007 15:28:43.223057 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6\": container with ID starting with 0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6 not found: ID does not exist" containerID="0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.223133 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6"} err="failed to get container status \"0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6\": rpc error: code = NotFound desc = could not find container \"0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6\": container with ID starting with 0a34b970da6b3a79516fff38c1acc44f3e6c5ab6c3fbe58fb32e18abf77f7dd6 not found: ID does not exist"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.223180 4959 scope.go:117] "RemoveContainer" containerID="27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683"
Oct 07 15:28:43 crc kubenswrapper[4959]: E1007 15:28:43.223593 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683\": container with ID starting with 27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683 not found: ID does not exist" containerID="27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683"
Oct 07 15:28:43 crc kubenswrapper[4959]: I1007 15:28:43.223647 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683"} err="failed to get container status \"27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683\": rpc error: code = NotFound desc = could not find container \"27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683\": container with ID starting with 27c857c8ef9d177e17f3ec830e47f665aaef9bfef5d371428f6b0fd87955e683 not found: ID does not exist"
Oct 07 15:28:44 crc kubenswrapper[4959]: I1007 15:28:44.809602 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"
Oct 07 15:28:44 crc kubenswrapper[4959]: E1007 15:28:44.810447 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:28:44 crc kubenswrapper[4959]: I1007 15:28:44.822681 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" path="/var/lib/kubelet/pods/56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7/volumes"
Oct 07 15:28:54 crc kubenswrapper[4959]: I1007 15:28:54.461305 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w4dpf_f8122128-1530-410d-a26b-068922cea39b/control-plane-machine-set-operator/0.log"
Oct 07 15:28:54 crc kubenswrapper[4959]: I1007 15:28:54.639868 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c5bnk_d0203e72-df97-4a97-8f45-65175f7d9839/kube-rbac-proxy/0.log"
Oct 07 15:28:54 crc kubenswrapper[4959]: I1007 15:28:54.669957 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c5bnk_d0203e72-df97-4a97-8f45-65175f7d9839/machine-api-operator/0.log"
Oct 07 15:28:55 crc kubenswrapper[4959]: I1007 15:28:55.808840 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38"
Oct 07 15:28:55 crc kubenswrapper[4959]: E1007 15:28:55.809429 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.070933 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vhghw"]
Oct 07 15:29:05 crc kubenswrapper[4959]: E1007 15:29:05.072589 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="extract-utilities"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.072611 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="extract-utilities"
Oct 07 15:29:05 crc kubenswrapper[4959]: E1007 15:29:05.072671 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="extract-content"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.072681 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="extract-content"
Oct 07 15:29:05 crc kubenswrapper[4959]: E1007 15:29:05.072696 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="registry-server"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.072705 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="registry-server"
Oct 07 15:29:05 crc kubenswrapper[4959]: E1007 15:29:05.072741 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" containerName="container-00"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.072749 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" containerName="container-00"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.073307 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="788d2d4c-be2a-4f3a-bd40-9ac041bc5f54" containerName="container-00"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.073352 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fb45ab-f91f-48c2-8bc7-0353a9e3c4b7" containerName="registry-server"
Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.086238 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.103235 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhghw"] Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.214311 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-utilities\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.214393 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-catalog-content\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.214511 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzng\" (UniqueName: \"kubernetes.io/projected/946a8415-202e-41e8-b389-76d0feb98773-kube-api-access-nkzng\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.316576 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-utilities\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.316670 4959 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-catalog-content\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.316725 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkzng\" (UniqueName: \"kubernetes.io/projected/946a8415-202e-41e8-b389-76d0feb98773-kube-api-access-nkzng\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.317361 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-utilities\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.317419 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-catalog-content\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.339788 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzng\" (UniqueName: \"kubernetes.io/projected/946a8415-202e-41e8-b389-76d0feb98773-kube-api-access-nkzng\") pod \"redhat-marketplace-vhghw\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.414295 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:05 crc kubenswrapper[4959]: I1007 15:29:05.962907 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhghw"] Oct 07 15:29:06 crc kubenswrapper[4959]: I1007 15:29:06.326641 4959 generic.go:334] "Generic (PLEG): container finished" podID="946a8415-202e-41e8-b389-76d0feb98773" containerID="6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8" exitCode=0 Oct 07 15:29:06 crc kubenswrapper[4959]: I1007 15:29:06.326726 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhghw" event={"ID":"946a8415-202e-41e8-b389-76d0feb98773","Type":"ContainerDied","Data":"6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8"} Oct 07 15:29:06 crc kubenswrapper[4959]: I1007 15:29:06.327028 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhghw" event={"ID":"946a8415-202e-41e8-b389-76d0feb98773","Type":"ContainerStarted","Data":"6bac3938ba7458e8b15663e5986ce154ae2d5d6eedc4f1a6b6fb361407671b52"} Oct 07 15:29:08 crc kubenswrapper[4959]: I1007 15:29:08.351459 4959 generic.go:334] "Generic (PLEG): container finished" podID="946a8415-202e-41e8-b389-76d0feb98773" containerID="7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97" exitCode=0 Oct 07 15:29:08 crc kubenswrapper[4959]: I1007 15:29:08.351552 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhghw" event={"ID":"946a8415-202e-41e8-b389-76d0feb98773","Type":"ContainerDied","Data":"7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97"} Oct 07 15:29:08 crc kubenswrapper[4959]: I1007 15:29:08.591492 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-d6468_fe93de9f-3c30-4373-bc80-912dd219d1f9/cert-manager-controller/0.log" Oct 07 15:29:08 crc 
kubenswrapper[4959]: I1007 15:29:08.800002 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-585vd_897ad114-2a60-468e-8c81-2367ded7fe7b/cert-manager-cainjector/0.log" Oct 07 15:29:08 crc kubenswrapper[4959]: I1007 15:29:08.842505 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hp7q8_617c6991-922b-4bd1-b578-2327061ba973/cert-manager-webhook/0.log" Oct 07 15:29:09 crc kubenswrapper[4959]: I1007 15:29:09.366864 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhghw" event={"ID":"946a8415-202e-41e8-b389-76d0feb98773","Type":"ContainerStarted","Data":"36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6"} Oct 07 15:29:09 crc kubenswrapper[4959]: I1007 15:29:09.399785 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vhghw" podStartSLOduration=1.864597169 podStartE2EDuration="4.399750858s" podCreationTimestamp="2025-10-07 15:29:05 +0000 UTC" firstStartedPulling="2025-10-07 15:29:06.328615673 +0000 UTC m=+8898.489338350" lastFinishedPulling="2025-10-07 15:29:08.863769362 +0000 UTC m=+8901.024492039" observedRunningTime="2025-10-07 15:29:09.387734963 +0000 UTC m=+8901.548457660" watchObservedRunningTime="2025-10-07 15:29:09.399750858 +0000 UTC m=+8901.560473535" Oct 07 15:29:10 crc kubenswrapper[4959]: I1007 15:29:10.809443 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:29:10 crc kubenswrapper[4959]: E1007 15:29:10.811752 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:29:15 crc kubenswrapper[4959]: I1007 15:29:15.415386 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:15 crc kubenswrapper[4959]: I1007 15:29:15.416251 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:15 crc kubenswrapper[4959]: I1007 15:29:15.477929 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:15 crc kubenswrapper[4959]: I1007 15:29:15.541909 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:15 crc kubenswrapper[4959]: I1007 15:29:15.717796 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhghw"] Oct 07 15:29:17 crc kubenswrapper[4959]: I1007 15:29:17.452881 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vhghw" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="registry-server" containerID="cri-o://36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6" gracePeriod=2 Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.070955 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.136972 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-catalog-content\") pod \"946a8415-202e-41e8-b389-76d0feb98773\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.137273 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkzng\" (UniqueName: \"kubernetes.io/projected/946a8415-202e-41e8-b389-76d0feb98773-kube-api-access-nkzng\") pod \"946a8415-202e-41e8-b389-76d0feb98773\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.137488 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-utilities\") pod \"946a8415-202e-41e8-b389-76d0feb98773\" (UID: \"946a8415-202e-41e8-b389-76d0feb98773\") " Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.138952 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-utilities" (OuterVolumeSpecName: "utilities") pod "946a8415-202e-41e8-b389-76d0feb98773" (UID: "946a8415-202e-41e8-b389-76d0feb98773"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.145512 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946a8415-202e-41e8-b389-76d0feb98773-kube-api-access-nkzng" (OuterVolumeSpecName: "kube-api-access-nkzng") pod "946a8415-202e-41e8-b389-76d0feb98773" (UID: "946a8415-202e-41e8-b389-76d0feb98773"). InnerVolumeSpecName "kube-api-access-nkzng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.159083 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "946a8415-202e-41e8-b389-76d0feb98773" (UID: "946a8415-202e-41e8-b389-76d0feb98773"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.241611 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.241884 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkzng\" (UniqueName: \"kubernetes.io/projected/946a8415-202e-41e8-b389-76d0feb98773-kube-api-access-nkzng\") on node \"crc\" DevicePath \"\"" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.241899 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946a8415-202e-41e8-b389-76d0feb98773-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.466458 4959 generic.go:334] "Generic (PLEG): container finished" podID="946a8415-202e-41e8-b389-76d0feb98773" containerID="36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6" exitCode=0 Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.466529 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhghw" event={"ID":"946a8415-202e-41e8-b389-76d0feb98773","Type":"ContainerDied","Data":"36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6"} Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.466552 4959 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhghw" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.466578 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhghw" event={"ID":"946a8415-202e-41e8-b389-76d0feb98773","Type":"ContainerDied","Data":"6bac3938ba7458e8b15663e5986ce154ae2d5d6eedc4f1a6b6fb361407671b52"} Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.466608 4959 scope.go:117] "RemoveContainer" containerID="36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.492929 4959 scope.go:117] "RemoveContainer" containerID="7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.507436 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhghw"] Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.540315 4959 scope.go:117] "RemoveContainer" containerID="6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.545210 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhghw"] Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.602797 4959 scope.go:117] "RemoveContainer" containerID="36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6" Oct 07 15:29:18 crc kubenswrapper[4959]: E1007 15:29:18.603223 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6\": container with ID starting with 36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6 not found: ID does not exist" containerID="36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.603261 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6"} err="failed to get container status \"36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6\": rpc error: code = NotFound desc = could not find container \"36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6\": container with ID starting with 36ef2bc6d8b6eb8352550a369f8af54da3bb5e030c723a8a9dffd09ca88f1aa6 not found: ID does not exist" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.603285 4959 scope.go:117] "RemoveContainer" containerID="7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97" Oct 07 15:29:18 crc kubenswrapper[4959]: E1007 15:29:18.603479 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97\": container with ID starting with 7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97 not found: ID does not exist" containerID="7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.603518 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97"} err="failed to get container status \"7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97\": rpc error: code = NotFound desc = could not find container \"7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97\": container with ID starting with 7cc5bfbbe18b303040eb8cbf50173924c6c5dcb4889bcf44ae91bace3775fb97 not found: ID does not exist" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.603539 4959 scope.go:117] "RemoveContainer" containerID="6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8" Oct 07 15:29:18 crc kubenswrapper[4959]: E1007 
15:29:18.603800 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8\": container with ID starting with 6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8 not found: ID does not exist" containerID="6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.603824 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8"} err="failed to get container status \"6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8\": rpc error: code = NotFound desc = could not find container \"6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8\": container with ID starting with 6f987533e445445f2a426a1894ef946b6b48268e0dfa6803864126bd2202eee8 not found: ID does not exist" Oct 07 15:29:18 crc kubenswrapper[4959]: I1007 15:29:18.820757 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="946a8415-202e-41e8-b389-76d0feb98773" path="/var/lib/kubelet/pods/946a8415-202e-41e8-b389-76d0feb98773/volumes" Oct 07 15:29:22 crc kubenswrapper[4959]: I1007 15:29:22.723912 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-bc4dg_e91d7e0c-6f6c-4305-88f1-316fda279894/nmstate-console-plugin/0.log" Oct 07 15:29:22 crc kubenswrapper[4959]: I1007 15:29:22.957338 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-trnlr_0ce82e41-c87c-4a6a-85c5-63fa6986a917/nmstate-handler/0.log" Oct 07 15:29:23 crc kubenswrapper[4959]: I1007 15:29:23.038406 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-bqt8v_b74adf76-6b8e-4df4-a786-b241afc85aaf/nmstate-metrics/0.log" Oct 07 15:29:23 crc 
kubenswrapper[4959]: I1007 15:29:23.039008 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-bqt8v_b74adf76-6b8e-4df4-a786-b241afc85aaf/kube-rbac-proxy/0.log" Oct 07 15:29:23 crc kubenswrapper[4959]: I1007 15:29:23.271037 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-qxq76_ecfe0080-5a6d-4580-957f-9b07016a6f38/nmstate-operator/0.log" Oct 07 15:29:23 crc kubenswrapper[4959]: I1007 15:29:23.285337 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-648vt_b2b69213-b0ec-4fe5-ba7a-cfc1d32fdbdb/nmstate-webhook/0.log" Oct 07 15:29:23 crc kubenswrapper[4959]: E1007 15:29:23.809476 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:29:24 crc kubenswrapper[4959]: I1007 15:29:24.809367 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:29:24 crc kubenswrapper[4959]: E1007 15:29:24.810612 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.107763 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7b7r7_ec5e2185-a03f-459b-95ce-cf8a04c9742d/kube-rbac-proxy/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.215792 4959 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7b7r7_ec5e2185-a03f-459b-95ce-cf8a04c9742d/controller/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.282668 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.455876 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.480923 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.486754 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.543425 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.682228 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.684305 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.731767 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.762393 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.817108 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:29:38 crc kubenswrapper[4959]: E1007 15:29:38.817355 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.949163 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-metrics/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.950550 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-reloader/0.log" Oct 07 15:29:38 crc kubenswrapper[4959]: I1007 15:29:38.977390 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/cp-frr-files/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.017867 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/controller/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.129592 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/frr-metrics/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.212590 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/kube-rbac-proxy/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.299305 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/kube-rbac-proxy-frr/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.378470 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/reloader/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.509095 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-thbcx_ea4413f6-7433-4301-856f-51073cbf20b0/frr-k8s-webhook-server/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.765221 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79dcdc88ff-jv2fl_c2450601-6bf6-4ee8-af46-dafc0db98d8c/manager/0.log" Oct 07 15:29:39 crc kubenswrapper[4959]: I1007 15:29:39.966080 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-678c485567-fb7ps_d1fbc67f-df69-48a4-87a0-e9d429eca6f1/webhook-server/0.log" Oct 07 15:29:40 crc kubenswrapper[4959]: I1007 15:29:40.147133 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dgrrl_4a978343-2c48-4153-a20e-631bbe3c1595/kube-rbac-proxy/0.log" Oct 07 15:29:40 crc kubenswrapper[4959]: I1007 15:29:40.864386 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dgrrl_4a978343-2c48-4153-a20e-631bbe3c1595/speaker/0.log" Oct 07 15:29:41 crc kubenswrapper[4959]: I1007 15:29:41.570769 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cf76d_a786d6e0-64e4-4bfb-a93a-673b9d775053/frr/0.log" Oct 07 15:29:50 crc kubenswrapper[4959]: I1007 15:29:50.809696 4959 scope.go:117] 
"RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:29:50 crc kubenswrapper[4959]: E1007 15:29:50.810711 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:29:53 crc kubenswrapper[4959]: I1007 15:29:53.550042 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/util/0.log" Oct 07 15:29:53 crc kubenswrapper[4959]: I1007 15:29:53.759422 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/util/0.log" Oct 07 15:29:53 crc kubenswrapper[4959]: I1007 15:29:53.791343 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/pull/0.log" Oct 07 15:29:53 crc kubenswrapper[4959]: I1007 15:29:53.876259 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/pull/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.027402 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/util/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.051327 4959 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/pull/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.106118 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2t8lmb_927d2a77-5bce-4896-aec7-67a816a96bf0/extract/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.236229 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-utilities/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.480664 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-content/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.482331 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-utilities/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.520541 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-content/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.674503 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-utilities/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.674512 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/extract-content/0.log" Oct 07 15:29:54 crc kubenswrapper[4959]: I1007 15:29:54.906989 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-utilities/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.073612 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rx5f5_bb339141-77d5-4681-bc7a-549f37140f3c/registry-server/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.180030 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-utilities/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.215502 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-content/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.254157 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-content/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.479197 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-content/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.517805 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/extract-utilities/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.782395 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/util/0.log" Oct 07 15:29:55 crc kubenswrapper[4959]: I1007 15:29:55.972501 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/util/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.051249 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/pull/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.070647 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/pull/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.226863 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/util/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.363809 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/pull/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.391904 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6dl2q_ba266f6d-03a0-4e6f-b1fc-be71300ce515/extract/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.663112 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fbtxf_0548f538-781a-406b-8d2c-4449281cc77c/marketplace-operator/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: I1007 15:29:56.709822 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p6jh_0714844d-6d03-4a23-9611-e02495624e6d/registry-server/0.log" Oct 07 15:29:56 crc kubenswrapper[4959]: 
I1007 15:29:56.920526 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-utilities/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.158344 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-utilities/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.205317 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-content/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.217397 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-content/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.433259 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-content/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.494262 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/extract-utilities/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.549592 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-utilities/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.692094 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kk98t_6f310322-04af-455b-9cbc-d49bd49aec71/registry-server/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.817017 4959 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-utilities/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.818372 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-content/0.log" Oct 07 15:29:57 crc kubenswrapper[4959]: I1007 15:29:57.847425 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-content/0.log" Oct 07 15:29:58 crc kubenswrapper[4959]: I1007 15:29:58.000335 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-utilities/0.log" Oct 07 15:29:58 crc kubenswrapper[4959]: I1007 15:29:58.119648 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/extract-content/0.log" Oct 07 15:29:58 crc kubenswrapper[4959]: I1007 15:29:58.524320 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2nls2_73f8e5f0-4948-4c4a-96d6-1e0ea1ffcd8b/registry-server/0.log" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.147740 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw"] Oct 07 15:30:00 crc kubenswrapper[4959]: E1007 15:30:00.148655 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="registry-server" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.148674 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="registry-server" Oct 07 15:30:00 crc kubenswrapper[4959]: E1007 15:30:00.148688 4959 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="extract-utilities" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.148695 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="extract-utilities" Oct 07 15:30:00 crc kubenswrapper[4959]: E1007 15:30:00.148724 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="extract-content" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.148732 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="extract-content" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.148996 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="946a8415-202e-41e8-b389-76d0feb98773" containerName="registry-server" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.149979 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.155307 4959 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.155596 4959 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.185892 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw"] Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.219255 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47cc8573-e2fc-4902-ba62-75782a56de6f-secret-volume\") pod 
\"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.219789 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47cc8573-e2fc-4902-ba62-75782a56de6f-config-volume\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.219972 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mv5z\" (UniqueName: \"kubernetes.io/projected/47cc8573-e2fc-4902-ba62-75782a56de6f-kube-api-access-8mv5z\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.321967 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47cc8573-e2fc-4902-ba62-75782a56de6f-config-volume\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.322046 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mv5z\" (UniqueName: \"kubernetes.io/projected/47cc8573-e2fc-4902-ba62-75782a56de6f-kube-api-access-8mv5z\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.322131 4959 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47cc8573-e2fc-4902-ba62-75782a56de6f-secret-volume\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.323146 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47cc8573-e2fc-4902-ba62-75782a56de6f-config-volume\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.339984 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mv5z\" (UniqueName: \"kubernetes.io/projected/47cc8573-e2fc-4902-ba62-75782a56de6f-kube-api-access-8mv5z\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.346287 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47cc8573-e2fc-4902-ba62-75782a56de6f-secret-volume\") pod \"collect-profiles-29330850-r8dxw\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:00 crc kubenswrapper[4959]: I1007 15:30:00.482568 4959 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:01 crc kubenswrapper[4959]: I1007 15:30:01.031710 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw"] Oct 07 15:30:01 crc kubenswrapper[4959]: I1007 15:30:01.930254 4959 generic.go:334] "Generic (PLEG): container finished" podID="47cc8573-e2fc-4902-ba62-75782a56de6f" containerID="032d88568c2c70e2d4ac1ed1e72f18f28312b31f708d0dc746acc830521c31ae" exitCode=0 Oct 07 15:30:01 crc kubenswrapper[4959]: I1007 15:30:01.930763 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" event={"ID":"47cc8573-e2fc-4902-ba62-75782a56de6f","Type":"ContainerDied","Data":"032d88568c2c70e2d4ac1ed1e72f18f28312b31f708d0dc746acc830521c31ae"} Oct 07 15:30:01 crc kubenswrapper[4959]: I1007 15:30:01.930790 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" event={"ID":"47cc8573-e2fc-4902-ba62-75782a56de6f","Type":"ContainerStarted","Data":"6c6395d121daaf04ec89434846643cd8752bdbf33029aca8c76c44367dd43e1d"} Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.297764 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.389460 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47cc8573-e2fc-4902-ba62-75782a56de6f-secret-volume\") pod \"47cc8573-e2fc-4902-ba62-75782a56de6f\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.389745 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47cc8573-e2fc-4902-ba62-75782a56de6f-config-volume\") pod \"47cc8573-e2fc-4902-ba62-75782a56de6f\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.389838 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mv5z\" (UniqueName: \"kubernetes.io/projected/47cc8573-e2fc-4902-ba62-75782a56de6f-kube-api-access-8mv5z\") pod \"47cc8573-e2fc-4902-ba62-75782a56de6f\" (UID: \"47cc8573-e2fc-4902-ba62-75782a56de6f\") " Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.390429 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47cc8573-e2fc-4902-ba62-75782a56de6f-config-volume" (OuterVolumeSpecName: "config-volume") pod "47cc8573-e2fc-4902-ba62-75782a56de6f" (UID: "47cc8573-e2fc-4902-ba62-75782a56de6f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.396237 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc8573-e2fc-4902-ba62-75782a56de6f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47cc8573-e2fc-4902-ba62-75782a56de6f" (UID: "47cc8573-e2fc-4902-ba62-75782a56de6f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.400995 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47cc8573-e2fc-4902-ba62-75782a56de6f-kube-api-access-8mv5z" (OuterVolumeSpecName: "kube-api-access-8mv5z") pod "47cc8573-e2fc-4902-ba62-75782a56de6f" (UID: "47cc8573-e2fc-4902-ba62-75782a56de6f"). InnerVolumeSpecName "kube-api-access-8mv5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.491900 4959 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47cc8573-e2fc-4902-ba62-75782a56de6f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.491950 4959 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47cc8573-e2fc-4902-ba62-75782a56de6f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.491965 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mv5z\" (UniqueName: \"kubernetes.io/projected/47cc8573-e2fc-4902-ba62-75782a56de6f-kube-api-access-8mv5z\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.809576 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:30:03 crc kubenswrapper[4959]: E1007 15:30:03.809870 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" 
Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.954008 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" event={"ID":"47cc8573-e2fc-4902-ba62-75782a56de6f","Type":"ContainerDied","Data":"6c6395d121daaf04ec89434846643cd8752bdbf33029aca8c76c44367dd43e1d"} Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.954052 4959 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c6395d121daaf04ec89434846643cd8752bdbf33029aca8c76c44367dd43e1d" Oct 07 15:30:03 crc kubenswrapper[4959]: I1007 15:30:03.954071 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-r8dxw" Oct 07 15:30:04 crc kubenswrapper[4959]: I1007 15:30:04.369249 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"] Oct 07 15:30:04 crc kubenswrapper[4959]: I1007 15:30:04.380925 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-km6jn"] Oct 07 15:30:04 crc kubenswrapper[4959]: I1007 15:30:04.822692 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c7f350-283b-4644-92cc-0d1546edfe88" path="/var/lib/kubelet/pods/82c7f350-283b-4644-92cc-0d1546edfe88/volumes" Oct 07 15:30:15 crc kubenswrapper[4959]: I1007 15:30:15.809529 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:30:15 crc kubenswrapper[4959]: E1007 15:30:15.811237 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:30:26 crc kubenswrapper[4959]: I1007 15:30:26.816382 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:30:26 crc kubenswrapper[4959]: E1007 15:30:26.817566 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:30:39 crc kubenswrapper[4959]: I1007 15:30:39.809679 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:30:39 crc kubenswrapper[4959]: E1007 15:30:39.811846 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:30:48 crc kubenswrapper[4959]: E1007 15:30:48.840815 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:30:52 crc kubenswrapper[4959]: I1007 15:30:52.808775 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:30:52 crc kubenswrapper[4959]: E1007 
15:30:52.810685 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:30:58 crc kubenswrapper[4959]: I1007 15:30:58.089103 4959 scope.go:117] "RemoveContainer" containerID="32ee6979849600466f7bd44656735fa0866922eaab2f154fb0a5898721b12478" Oct 07 15:31:06 crc kubenswrapper[4959]: I1007 15:31:06.809141 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:31:06 crc kubenswrapper[4959]: E1007 15:31:06.809988 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:31:18 crc kubenswrapper[4959]: I1007 15:31:18.820000 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:31:18 crc kubenswrapper[4959]: E1007 15:31:18.821356 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:31:29 crc 
kubenswrapper[4959]: I1007 15:31:29.808363 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:31:29 crc kubenswrapper[4959]: E1007 15:31:29.809452 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:31:42 crc kubenswrapper[4959]: I1007 15:31:42.809368 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:31:42 crc kubenswrapper[4959]: E1007 15:31:42.811938 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:31:54 crc kubenswrapper[4959]: I1007 15:31:54.810617 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:31:54 crc kubenswrapper[4959]: E1007 15:31:54.812071 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 
07 15:31:58 crc kubenswrapper[4959]: I1007 15:31:58.149753 4959 scope.go:117] "RemoveContainer" containerID="2943187e787f6d0bbcd4f9948fe9e9cbfde571dd76dcf7a9c494b8ad8d131829" Oct 07 15:32:05 crc kubenswrapper[4959]: I1007 15:32:05.808702 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:32:05 crc kubenswrapper[4959]: E1007 15:32:05.810879 4959 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dgmtp_openshift-machine-config-operator(4cbefab5-1f50-4f44-9163-479625fa11a4)\"" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" Oct 07 15:32:11 crc kubenswrapper[4959]: E1007 15:32:11.808657 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:32:17 crc kubenswrapper[4959]: I1007 15:32:17.809675 4959 scope.go:117] "RemoveContainer" containerID="8fe96918f0ef75b8c97a362216321ea54fa2951e718467eb7fa99327e0581e38" Oct 07 15:32:18 crc kubenswrapper[4959]: I1007 15:32:18.517939 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" event={"ID":"4cbefab5-1f50-4f44-9163-479625fa11a4","Type":"ContainerStarted","Data":"89abd8cfc6167fd79b12728327350a529adbf19a03aa5cdf7ddef3904292d47f"} Oct 07 15:32:50 crc kubenswrapper[4959]: I1007 15:32:50.913772 4959 generic.go:334] "Generic (PLEG): container finished" podID="bcf6dce6-035d-4780-8841-08fa857032f9" containerID="15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab" exitCode=0 Oct 07 15:32:50 crc kubenswrapper[4959]: I1007 15:32:50.913879 4959 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" event={"ID":"bcf6dce6-035d-4780-8841-08fa857032f9","Type":"ContainerDied","Data":"15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab"} Oct 07 15:32:50 crc kubenswrapper[4959]: I1007 15:32:50.915608 4959 scope.go:117] "RemoveContainer" containerID="15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab" Oct 07 15:32:51 crc kubenswrapper[4959]: I1007 15:32:51.656435 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hc7gg_must-gather-n2n6r_bcf6dce6-035d-4780-8841-08fa857032f9/gather/0.log" Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.238355 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hc7gg/must-gather-n2n6r"] Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.239268 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="copy" containerID="cri-o://4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a" gracePeriod=2 Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.249583 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hc7gg/must-gather-n2n6r"] Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.748499 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hc7gg_must-gather-n2n6r_bcf6dce6-035d-4780-8841-08fa857032f9/copy/0.log" Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.749400 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.847095 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdvl\" (UniqueName: \"kubernetes.io/projected/bcf6dce6-035d-4780-8841-08fa857032f9-kube-api-access-6tdvl\") pod \"bcf6dce6-035d-4780-8841-08fa857032f9\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.847670 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcf6dce6-035d-4780-8841-08fa857032f9-must-gather-output\") pod \"bcf6dce6-035d-4780-8841-08fa857032f9\" (UID: \"bcf6dce6-035d-4780-8841-08fa857032f9\") " Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.864979 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf6dce6-035d-4780-8841-08fa857032f9-kube-api-access-6tdvl" (OuterVolumeSpecName: "kube-api-access-6tdvl") pod "bcf6dce6-035d-4780-8841-08fa857032f9" (UID: "bcf6dce6-035d-4780-8841-08fa857032f9"). InnerVolumeSpecName "kube-api-access-6tdvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:33:08 crc kubenswrapper[4959]: I1007 15:33:08.950708 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdvl\" (UniqueName: \"kubernetes.io/projected/bcf6dce6-035d-4780-8841-08fa857032f9-kube-api-access-6tdvl\") on node \"crc\" DevicePath \"\"" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.093732 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf6dce6-035d-4780-8841-08fa857032f9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bcf6dce6-035d-4780-8841-08fa857032f9" (UID: "bcf6dce6-035d-4780-8841-08fa857032f9"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.120952 4959 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hc7gg_must-gather-n2n6r_bcf6dce6-035d-4780-8841-08fa857032f9/copy/0.log" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.121482 4959 generic.go:334] "Generic (PLEG): container finished" podID="bcf6dce6-035d-4780-8841-08fa857032f9" containerID="4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a" exitCode=143 Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.121585 4959 scope.go:117] "RemoveContainer" containerID="4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.121589 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hc7gg/must-gather-n2n6r" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.157006 4959 scope.go:117] "RemoveContainer" containerID="15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.160674 4959 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcf6dce6-035d-4780-8841-08fa857032f9-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.262942 4959 scope.go:117] "RemoveContainer" containerID="4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a" Oct 07 15:33:09 crc kubenswrapper[4959]: E1007 15:33:09.263951 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a\": container with ID starting with 4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a not found: ID does not exist" 
containerID="4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.264004 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a"} err="failed to get container status \"4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a\": rpc error: code = NotFound desc = could not find container \"4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a\": container with ID starting with 4dda7e13b8ee560eb9f6d47a42546c087aa1d1d563d0b967ffcde1cb21710a0a not found: ID does not exist" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.264037 4959 scope.go:117] "RemoveContainer" containerID="15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab" Oct 07 15:33:09 crc kubenswrapper[4959]: E1007 15:33:09.264404 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab\": container with ID starting with 15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab not found: ID does not exist" containerID="15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab" Oct 07 15:33:09 crc kubenswrapper[4959]: I1007 15:33:09.264442 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab"} err="failed to get container status \"15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab\": rpc error: code = NotFound desc = could not find container \"15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab\": container with ID starting with 15096550693fa797c5205c015505386f2adbc00951fcc0aee0e96a75b87ed2ab not found: ID does not exist" Oct 07 15:33:10 crc kubenswrapper[4959]: I1007 15:33:10.822745 4959 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" path="/var/lib/kubelet/pods/bcf6dce6-035d-4780-8841-08fa857032f9/volumes" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.436651 4959 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7cc6"] Oct 07 15:33:28 crc kubenswrapper[4959]: E1007 15:33:28.438259 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cc8573-e2fc-4902-ba62-75782a56de6f" containerName="collect-profiles" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.438289 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cc8573-e2fc-4902-ba62-75782a56de6f" containerName="collect-profiles" Oct 07 15:33:28 crc kubenswrapper[4959]: E1007 15:33:28.438314 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="copy" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.438346 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="copy" Oct 07 15:33:28 crc kubenswrapper[4959]: E1007 15:33:28.438384 4959 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="gather" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.438394 4959 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="gather" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.438690 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cc8573-e2fc-4902-ba62-75782a56de6f" containerName="collect-profiles" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.438716 4959 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="gather" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.438734 4959 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bcf6dce6-035d-4780-8841-08fa857032f9" containerName="copy" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.440739 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.454847 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7cc6"] Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.563596 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhbq\" (UniqueName: \"kubernetes.io/projected/8b06d73d-631e-48e6-8220-3ffa74056983-kube-api-access-6nhbq\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.565063 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-catalog-content\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.565292 4959 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-utilities\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.670951 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhbq\" (UniqueName: \"kubernetes.io/projected/8b06d73d-631e-48e6-8220-3ffa74056983-kube-api-access-6nhbq\") pod \"redhat-operators-g7cc6\" (UID: 
\"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.671714 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-catalog-content\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.671921 4959 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-utilities\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.676765 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-utilities\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.677241 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-catalog-content\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.709810 4959 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhbq\" (UniqueName: \"kubernetes.io/projected/8b06d73d-631e-48e6-8220-3ffa74056983-kube-api-access-6nhbq\") pod \"redhat-operators-g7cc6\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " 
pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:28 crc kubenswrapper[4959]: I1007 15:33:28.763191 4959 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:29 crc kubenswrapper[4959]: I1007 15:33:29.264661 4959 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7cc6"] Oct 07 15:33:29 crc kubenswrapper[4959]: I1007 15:33:29.375595 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7cc6" event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerStarted","Data":"0f5bf0c77ccb509fd3502198f3f214f9226a0a3e21df3c68deff7271fcead05c"} Oct 07 15:33:29 crc kubenswrapper[4959]: E1007 15:33:29.812536 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:33:30 crc kubenswrapper[4959]: I1007 15:33:30.388591 4959 generic.go:334] "Generic (PLEG): container finished" podID="8b06d73d-631e-48e6-8220-3ffa74056983" containerID="c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b" exitCode=0 Oct 07 15:33:30 crc kubenswrapper[4959]: I1007 15:33:30.388706 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7cc6" event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerDied","Data":"c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b"} Oct 07 15:33:30 crc kubenswrapper[4959]: I1007 15:33:30.391686 4959 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:33:32 crc kubenswrapper[4959]: I1007 15:33:32.421455 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7cc6" 
event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerStarted","Data":"0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88"} Oct 07 15:33:34 crc kubenswrapper[4959]: I1007 15:33:34.443545 4959 generic.go:334] "Generic (PLEG): container finished" podID="8b06d73d-631e-48e6-8220-3ffa74056983" containerID="0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88" exitCode=0 Oct 07 15:33:34 crc kubenswrapper[4959]: I1007 15:33:34.444012 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7cc6" event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerDied","Data":"0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88"} Oct 07 15:33:35 crc kubenswrapper[4959]: I1007 15:33:35.457672 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7cc6" event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerStarted","Data":"396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5"} Oct 07 15:33:35 crc kubenswrapper[4959]: I1007 15:33:35.482703 4959 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7cc6" podStartSLOduration=2.9980639350000002 podStartE2EDuration="7.482676348s" podCreationTimestamp="2025-10-07 15:33:28 +0000 UTC" firstStartedPulling="2025-10-07 15:33:30.391399081 +0000 UTC m=+9162.552121758" lastFinishedPulling="2025-10-07 15:33:34.876011494 +0000 UTC m=+9167.036734171" observedRunningTime="2025-10-07 15:33:35.480882527 +0000 UTC m=+9167.641605224" watchObservedRunningTime="2025-10-07 15:33:35.482676348 +0000 UTC m=+9167.643399045" Oct 07 15:33:38 crc kubenswrapper[4959]: I1007 15:33:38.763571 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:38 crc kubenswrapper[4959]: I1007 15:33:38.769058 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:39 crc kubenswrapper[4959]: I1007 15:33:39.850217 4959 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7cc6" podUID="8b06d73d-631e-48e6-8220-3ffa74056983" containerName="registry-server" probeResult="failure" output=< Oct 07 15:33:39 crc kubenswrapper[4959]: timeout: failed to connect service ":50051" within 1s Oct 07 15:33:39 crc kubenswrapper[4959]: > Oct 07 15:33:48 crc kubenswrapper[4959]: I1007 15:33:48.828516 4959 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:48 crc kubenswrapper[4959]: I1007 15:33:48.884587 4959 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:49 crc kubenswrapper[4959]: I1007 15:33:49.069308 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7cc6"] Oct 07 15:33:50 crc kubenswrapper[4959]: I1007 15:33:50.702337 4959 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7cc6" podUID="8b06d73d-631e-48e6-8220-3ffa74056983" containerName="registry-server" containerID="cri-o://396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5" gracePeriod=2 Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.220133 4959 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.312345 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-catalog-content\") pod \"8b06d73d-631e-48e6-8220-3ffa74056983\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.312436 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nhbq\" (UniqueName: \"kubernetes.io/projected/8b06d73d-631e-48e6-8220-3ffa74056983-kube-api-access-6nhbq\") pod \"8b06d73d-631e-48e6-8220-3ffa74056983\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.313330 4959 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-utilities\") pod \"8b06d73d-631e-48e6-8220-3ffa74056983\" (UID: \"8b06d73d-631e-48e6-8220-3ffa74056983\") " Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.315149 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-utilities" (OuterVolumeSpecName: "utilities") pod "8b06d73d-631e-48e6-8220-3ffa74056983" (UID: "8b06d73d-631e-48e6-8220-3ffa74056983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.321991 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b06d73d-631e-48e6-8220-3ffa74056983-kube-api-access-6nhbq" (OuterVolumeSpecName: "kube-api-access-6nhbq") pod "8b06d73d-631e-48e6-8220-3ffa74056983" (UID: "8b06d73d-631e-48e6-8220-3ffa74056983"). InnerVolumeSpecName "kube-api-access-6nhbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.415977 4959 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nhbq\" (UniqueName: \"kubernetes.io/projected/8b06d73d-631e-48e6-8220-3ffa74056983-kube-api-access-6nhbq\") on node \"crc\" DevicePath \"\"" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.416020 4959 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.437914 4959 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b06d73d-631e-48e6-8220-3ffa74056983" (UID: "8b06d73d-631e-48e6-8220-3ffa74056983"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.518518 4959 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b06d73d-631e-48e6-8220-3ffa74056983-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.714389 4959 generic.go:334] "Generic (PLEG): container finished" podID="8b06d73d-631e-48e6-8220-3ffa74056983" containerID="396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5" exitCode=0 Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.714452 4959 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7cc6" event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerDied","Data":"396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5"} Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.714488 4959 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-g7cc6" event={"ID":"8b06d73d-631e-48e6-8220-3ffa74056983","Type":"ContainerDied","Data":"0f5bf0c77ccb509fd3502198f3f214f9226a0a3e21df3c68deff7271fcead05c"} Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.714488 4959 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7cc6" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.714552 4959 scope.go:117] "RemoveContainer" containerID="396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.760364 4959 scope.go:117] "RemoveContainer" containerID="0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.779205 4959 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7cc6"] Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.792136 4959 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7cc6"] Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.796215 4959 scope.go:117] "RemoveContainer" containerID="c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.851462 4959 scope.go:117] "RemoveContainer" containerID="396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5" Oct 07 15:33:51 crc kubenswrapper[4959]: E1007 15:33:51.852174 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5\": container with ID starting with 396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5 not found: ID does not exist" containerID="396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.852214 4959 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5"} err="failed to get container status \"396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5\": rpc error: code = NotFound desc = could not find container \"396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5\": container with ID starting with 396e2a5d5600c5ed80ce70a250c6391a77e94cb3461033c43a36baaa05c124c5 not found: ID does not exist" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.852246 4959 scope.go:117] "RemoveContainer" containerID="0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88" Oct 07 15:33:51 crc kubenswrapper[4959]: E1007 15:33:51.852817 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88\": container with ID starting with 0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88 not found: ID does not exist" containerID="0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.853012 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88"} err="failed to get container status \"0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88\": rpc error: code = NotFound desc = could not find container \"0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88\": container with ID starting with 0c862aecf380b7b59b7e3fb6b0f8266e77b84b7e4f19b920f64a790210528f88 not found: ID does not exist" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.853121 4959 scope.go:117] "RemoveContainer" containerID="c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b" Oct 07 15:33:51 crc kubenswrapper[4959]: E1007 
15:33:51.853713 4959 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b\": container with ID starting with c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b not found: ID does not exist" containerID="c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b" Oct 07 15:33:51 crc kubenswrapper[4959]: I1007 15:33:51.853761 4959 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b"} err="failed to get container status \"c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b\": rpc error: code = NotFound desc = could not find container \"c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b\": container with ID starting with c4cc423f9d48aa8180c6f2bfc49137e78bc3e99312573184da208b20df97347b not found: ID does not exist" Oct 07 15:33:52 crc kubenswrapper[4959]: I1007 15:33:52.822092 4959 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b06d73d-631e-48e6-8220-3ffa74056983" path="/var/lib/kubelet/pods/8b06d73d-631e-48e6-8220-3ffa74056983/volumes" Oct 07 15:34:31 crc kubenswrapper[4959]: E1007 15:34:31.809472 4959 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 07 15:34:37 crc kubenswrapper[4959]: I1007 15:34:37.695874 4959 patch_prober.go:28] interesting pod/machine-config-daemon-dgmtp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:34:37 crc kubenswrapper[4959]: I1007 15:34:37.696776 4959 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dgmtp" podUID="4cbefab5-1f50-4f44-9163-479625fa11a4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"